Feature: Appview v2 ()

* add buf & connectrpc, codegen client

* lint

* prettier ignore

* fix prettier ignore

* tidy & add tests

* filler commit

* rm filler

* server boilerplate

* follows impl

* posts impl

* posts & likes impl

* repost impl

* profiles & handle null values

* list impl

* mutes impl

* blocks impl

* misc

* feed gen impl

* label impl

* notifs impl

* feeds impl

* threads impl

* early sketchwork

* wip

* stub out thick client

* in-progress work on hydrator

* tweak

* hydrate profile labels, detail lists

* feedgen hydration

* protobuf tweaks

* more protobuf tweaks

* wip

* snake case

* moar snake case

* tidy actor hydration

* tidy parsing

* type fixes, renaming, comments in hydrator

* hydrate list items and likes

* hydrate notifications

* feed hydration

* graph & label hydration

* more record protobufs

* pluralize

* tweak pbs

* use new methods

* Setup dataplane grpc client/mock server ()

* add buf & connectrpc, codegen client

* lint

* prettier ignore

* fix prettier ignore

* tidy & add tests

* add record getter mocks

* post view hydration

* fix up mock dataplane to match new protos

* missed one

* wire up dataplane in ctx & dev-env

* adding some basic views

* feed hydration, add blocks to post hydration

* pass over notification hydration

* tidy

* merge

* implement getProfile

* hydrate post aggregation and viewer state

* fix

* fix codegen

* get some tests passing!

* add takedowns & some like bugfixing

* all profile tests passing!

* likes test

* follow endpoints using data plane

* reorg follow block rules

* reposts

* post views!

* implement getList w/ dataplane caveat

* adjust dataplane getListMembers to return listitem uris

* implement getListMutes and -Blocks w/ dataplane

* suggestions

* timeline

* misc view fixes

* view fixes for mutes, self-mute/block

* author feed

* feed gen routes

* tidy

* misc block/mute fixes

* list feed & actor likes

* implement getLists, fix some empty cursors

* implement getMutes, empty profile description fix

* implement getBlocks, block application fix

* implement getSuggestedFollowsByActor, needs some fixes

* feed generation

* search routes

* threads

* tidy

* fix some snaps

* fix getSuggestedFollowsByActor

* implement listNotifications

* implement getUnreadCount w/ dataplane

* implement notifications.updateSeen w/ dataplane

* 3rd party blocking tests

* blocked profile viewer

* add search mocks

* refactor getFeed

* createPipeline -> createPipelineNew

* basic replygating functionality on dataplane w/o filtering violating replies

* hack threadgates into dataplane, apply gates

* deterministic thread orders in dataplane

* misc cleanup around dataplane

* upgrade typescript to v5.3

* update typescript linter deps

* sync bsky proto, codegen

* update dataplane, sync with bsky proto updates

* remove indexer, ingester, daemon, moderation services from appview

* convert more bsky internals to dataplane, remove custom feedgens, implement mute/unmuting in mock dataplane

* remove bsky services. db and indexing logic into mock dataplane.

* remove tests not needed by appview v2, misc reorg

* add basic in-mem repo subscription to dataplane mock

* fix dev-env, bsky tests, bsky build

* cull bsky service entrypoint

* add bsky service readme

* build

* tidy

* tidy, fix pds proxy tests

* fix

* fix bsky entrypoint deps

* support http2 grpc client

* build

* fix dataplane bad tls config/default

* support multiple dataplane urls, retry when unavailable

* build

* tidy/fix

* move dataplane mock tests into their own dir

* cover label hydration through timeline test

* bring back labels in appview tests

* remove unused db primary/replica/coordinator from bsky dataplane

* bsky proto add cids to contracts, buf codegen

* sync-up bsky data-plane w/ codegen updates

* start using dataplane interaction endpoints

* add file

* avoid overfetching from dataplane, plumb feed items and cids

* pass refs through for post viewer state

* switch list feeds to use feed item in dataplane

* handle not found err on get-thread dataplane call

* support use of search service rather than dataplane methods

* mark some appview v2 todos

* tidy

* still use dataplane on search endpoints when search service is not configured

* fix pds test

* fix up bsky tests & snaps

* tidy migrations

* fix appview-v2 docker build

* Support label issuer tied to appview v2 ()

support label issuer tied to appview

* Appview v2: handle empty cursor on list notifications ()

handle empty cursor on appview listnotifs

* Update appview v2 to use author feed enum ()

* update bsky protos with author feed enum, misc feed item changes

* support new author feed enums in dataplane

* fix build

* Appview v2: utilize sorted-at field in bsky protos ()

utilize new sorted-at field in bsky protos

* remove all dataplane usage of GetLikeCounts, switch to GetInteractionCounts

* Appview v2, sync w/ changes to protos ()

* sync bsky protos

* sync-up bsky implementation w/ proto changes

* Appview v2 initial implementation for getPopularFeedGenerators ()

add an initial implementation for getPopularFeedGenerators on appview v2

* merge

* fixes

* fix feed tests

* fix bsync mock

* format

* remove unused config

* fix lockfile

* another lockfile fix

* fix duplicate type

* fix duplicate test

* Appview v2 handling clearly bad cursors ()

* make mock dataplane cursors different from v1 cursors

* fail open on clearly bad appview cursors

* fix pds appview proxy snaps

* Appview v2 no notifs seen behavior ()

* alter behavior for presenting notifications w/ no last-seen time

* fix pds proxy tests

* Appview v2 dataplane retries based on client host ()

choose dataplane client for retries based on host when possible/relevant

* don't apply negated labels

* display suspensions on actor profile in appview v2

* Appview v2 use dataplane for identity lookups ()

* update bsky proto w/ identity methods

* setup identity endpoints on mock dataplane

* move from idresolver to dataplane for identity lookups on appview

* tidy

* Appview v2: apply safe takedown refs to records, actors ()

apply safe takedown refs to records, actors

* Fix timing on appview v2 repo rev header ()

fix timing on appview repo rev

* fix post thread responses

* Appview v2 don't apply 3p self blocks ()

do not apply 3p self-blocks

* Appview v2 search for feed generators ()

* add protos for feedgen search

* support feed search on getPopularFeedGenerators

* Appview v2 config tidy ()

* remove mod and triage roles from appview

* rename cdn and search config

* remove custom feed harness from appview v2

* Appview v2: don't apply missing modlists ()

* dont apply missing mod lists

* update mock dataplane

* Update packages/bsky/src/hydration/hydrator.ts

Co-authored-by: devin ivy <devinivy@gmail.com>

* refactor & document a bit better

* fix up other routes

---------

Co-authored-by: devin ivy <devinivy@gmail.com>

* Appview v2 enforce post thread root boundary ()

* enforce post thread root boundary

* test thread root boundary

* Appview v2 fix admin environment variable ()

fix admin env in appview v2

* Remove re-pagination from getSuggestions ()

* remove re-pagination from getSuggestions

* fix test

* Adjust wording for account suspension ()

adjust wording for account suspension

* Appview v2: fix not-found and blocked uris in threads ()

* fix uris of not-found and blocked posts in threads

* update snaps

* Show author feed of takendown author to admins only ()

* fold in cid, auth, tracing, node version changes

* remove dead config from bsky service entrypoint

* build

* remove ozone test codepaths for appview v2

* tidy, docs fix

---------

Co-authored-by: Devin Ivy <devinivy@gmail.com>
Co-authored-by: Foysal Ahamed <foysal@blueskyweb.xyz>

@@ -3,7 +3,6 @@ on:
push:
branches:
- main
- appview-v1-courier
env:
REGISTRY: ${{ secrets.AWS_ECR_REGISTRY_USEAST2_PACKAGES_REGISTRY }}
USERNAME: ${{ secrets.AWS_ECR_REGISTRY_USEAST2_PACKAGES_USERNAME }}

@@ -3,6 +3,7 @@ on:
push:
branches:
- main
- appview-v2
env:
REGISTRY: ghcr.io
USERNAME: ${{ github.actor }}

@@ -8,3 +8,4 @@ pnpm-lock.yaml
.pnpm*
.changeset
*.d.ts
packages/bsky/src/data-plane/gen

@@ -14,7 +14,15 @@ export async function main() {
)
}
const filename = `${prefix}-${name}`
const dir = path.join(__dirname, '..', 'src', 'db', 'migrations')
const dir = path.join(
__dirname,
'..',
'src',
'data-plane',
'server',
'db',
'migrations',
)
await fs.writeFile(path.join(dir, `${filename}.ts`), template, { flag: 'wx' })
await fs.writeFile(

@@ -5,7 +5,7 @@ const buildShallow =
require('esbuild').build({
logLevel: 'info',
entryPoints: ['src/index.ts', 'src/db/index.ts'],
entryPoints: ['src/index.ts'],
bundle: true,
sourcemap: true,
outdir: 'dist',

@@ -42,6 +42,7 @@
"@atproto/xrpc-server": "workspace:^",
"@bufbuild/protobuf": "^1.5.0",
"@connectrpc/connect": "^1.1.4",
"@connectrpc/connect-express": "^1.1.4",
"@connectrpc/connect-node": "^1.1.4",
"@did-plc/lib": "^0.0.1",
"@isaacs/ttlcache": "^1.4.1",
@@ -80,6 +81,7 @@
"@types/express-serve-static-core": "^4.17.36",
"@types/pg": "^8.6.6",
"@types/qs": "^6.9.7",
"axios": "^0.27.2"
"axios": "^0.27.2",
"http2-express-bridge": "^1.0.7"
}
}

File diff suppressed because it is too large

@@ -1,108 +1,85 @@
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/actor/getProfile'
import { softDeleted } from '../../../../db/util'
import AppContext from '../../../../context'
import { Database } from '../../../../db'
import { Actor } from '../../../../db/tables/actor'
import {
ActorService,
ProfileDetailHydrationState,
} from '../../../../services/actor'
import { setRepoRev } from '../../../util'
import { createPipeline, noRules } from '../../../../pipeline'
import { ModerationService } from '../../../../services/moderation'
import { HydrationState, Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
export default function (server: Server, ctx: AppContext) {
const getProfile = createPipeline(skeleton, hydration, noRules, presentation)
server.app.bsky.actor.getProfile({
auth: ctx.authVerifier.optionalStandardOrRole,
handler: async ({ auth, params, res }) => {
const db = ctx.db.getReplica()
const actorService = ctx.services.actor(db)
const modService = ctx.services.moderation(ctx.db.getPrimary())
const { viewer, canViewTakedowns } = ctx.authVerifier.parseCreds(auth)
const [result, repoRev] = await Promise.allSettled([
getProfile(
{ ...params, viewer, canViewTakedowns },
{ db, actorService, modService },
),
actorService.getRepoRev(viewer),
])
const result = await getProfile(
{ ...params, viewer, canViewTakedowns },
ctx,
)
if (repoRev.status === 'fulfilled') {
setRepoRev(res, repoRev.value)
}
if (result.status === 'rejected') {
throw result.reason
}
const repoRev = await ctx.hydrator.actor.getRepoRevSafe(viewer)
setRepoRev(res, repoRev)
return {
encoding: 'application/json',
body: result.value,
body: result,
}
},
})
}
const skeleton = async (
params: Params,
ctx: Context,
): Promise<SkeletonState> => {
const { actorService } = ctx
const { canViewTakedowns } = params
const actor = await actorService.getActor(params.actor, true)
if (!actor) {
const skeleton = async (input: {
ctx: Context
params: Params
}): Promise<SkeletonState> => {
const { ctx, params } = input
const [did] = await ctx.hydrator.actor.getDids([params.actor])
if (!did) {
throw new InvalidRequestError('Profile not found')
}
if (!canViewTakedowns && softDeleted(actor)) {
if (actor.takedownRef?.includes('SUSPEND')) {
throw new InvalidRequestError(
'Account has been temporarily suspended',
'AccountTakedown',
)
} else {
throw new InvalidRequestError(
'Account has been taken down',
'AccountTakedown',
)
}
}
return { params, actor }
return { did }
}
const hydration = async (state: SkeletonState, ctx: Context) => {
const { actorService } = ctx
const { params, actor } = state
const { viewer, canViewTakedowns } = params
const hydration = await actorService.views.profileDetailHydration(
[actor.did],
{ viewer, includeSoftDeleted: canViewTakedowns },
const hydration = async (input: {
ctx: Context
params: Params
skeleton: SkeletonState
}) => {
const { ctx, params, skeleton } = input
return ctx.hydrator.hydrateProfilesDetailed(
[skeleton.did],
params.viewer,
true,
)
return { ...state, ...hydration }
}
const presentation = (state: HydrationState, ctx: Context) => {
const { actorService } = ctx
const { params, actor } = state
const { viewer } = params
const profiles = actorService.views.profileDetailPresentation(
[actor.did],
state,
{ viewer },
)
const profile = profiles[actor.did]
const presentation = (input: {
ctx: Context
params: Params
skeleton: SkeletonState
hydration: HydrationState
}) => {
const { ctx, params, skeleton, hydration } = input
const profile = ctx.views.profileDetailed(skeleton.did, hydration)
if (!profile) {
throw new InvalidRequestError('Profile not found')
} else if (
!params.canViewTakedowns &&
ctx.views.actorIsTakendown(skeleton.did, hydration)
) {
throw new InvalidRequestError(
'Account has been suspended',
'AccountTakedown',
)
}
return profile
}
type Context = {
db: Database
actorService: ActorService
modService: ModerationService
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
@@ -110,6 +87,4 @@ type Params = QueryParams & {
canViewTakedowns: boolean
}
type SkeletonState = { params: Params; actor: Actor }
type HydrationState = SkeletonState & ProfileDetailHydrationState
type SkeletonState = { did: string }
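
An editor's aside, not part of this diff: the rewritten read endpoints throughout this diff share the same four-step shape, skeleton -> hydration -> rules -> presentation, wired together by createPipeline from src/pipeline.ts. A minimal stand-in with assumed, simplified signatures (the real generics and helper types such as SkeletonFnInput differ) looks roughly like this:

// Sketch only: assumed signatures for the pipeline used by the appview v2 handlers.
type SkeletonFn<Ctx, Params, Skeleton> = (input: {
  ctx: Ctx
  params: Params
}) => Promise<Skeleton>
type HydrationFn<Ctx, Params, Skeleton, Hydration> = (input: {
  ctx: Ctx
  params: Params
  skeleton: Skeleton
}) => Promise<Hydration>
type RulesFn<Ctx, Params, Skeleton, Hydration> = (input: {
  ctx: Ctx
  params: Params
  skeleton: Skeleton
  hydration: Hydration
}) => Skeleton
type PresentationFn<Ctx, Params, Skeleton, Hydration, View> = (input: {
  ctx: Ctx
  params: Params
  skeleton: Skeleton
  hydration: Hydration
}) => View

const createPipeline =
  <Ctx, Params, Skeleton, Hydration, View>(
    skeletonFn: SkeletonFn<Ctx, Params, Skeleton>,
    hydrationFn: HydrationFn<Ctx, Params, Skeleton, Hydration>,
    rulesFn: RulesFn<Ctx, Params, Skeleton, Hydration>,
    presentationFn: PresentationFn<Ctx, Params, Skeleton, Hydration, View>,
  ) =>
  async (params: Params, ctx: Ctx): Promise<View> => {
    // 1. resolve the minimal set of uris/dids, usually via the dataplane
    const skeleton = await skeletonFn({ ctx, params })
    // 2. hydrate records, aggregates, and viewer state for those refs
    const hydration = await hydrationFn({ ctx, params, skeleton })
    // 3. apply block/mute/takedown rules by filtering the skeleton
    const filtered = rulesFn({ ctx, params, skeleton, hydration })
    // 4. render lexicon views from the filtered skeleton plus hydration state
    return presentationFn({ ctx, params, skeleton: filtered, hydration })
  }

Under this reading, the noRules helper used by getProfile and getProfiles would simply return the skeleton untouched.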

@@ -2,28 +2,21 @@ import { mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/actor/getProfiles'
import AppContext from '../../../../context'
import { Database } from '../../../../db'
import {
ActorService,
ProfileDetailHydrationState,
} from '../../../../services/actor'
import { setRepoRev } from '../../../util'
import { createPipeline, noRules } from '../../../../pipeline'
import { HydrationState, Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
export default function (server: Server, ctx: AppContext) {
const getProfile = createPipeline(skeleton, hydration, noRules, presentation)
server.app.bsky.actor.getProfiles({
auth: ctx.authVerifier.standardOptional,
handler: async ({ auth, params, res }) => {
const db = ctx.db.getReplica()
const actorService = ctx.services.actor(db)
const viewer = auth.credentials.iss
const [result, repoRev] = await Promise.all([
getProfile({ ...params, viewer }, { db, actorService }),
actorService.getRepoRev(viewer),
])
const result = await getProfile({ ...params, viewer }, ctx)
const repoRev = await ctx.hydrator.actor.getRepoRevSafe(viewer)
setRepoRev(res, repoRev)
return {
@@ -34,45 +27,44 @@ export default function (server: Server, ctx: AppContext) {
})
}
const skeleton = async (
params: Params,
ctx: Context,
): Promise<SkeletonState> => {
const { actorService } = ctx
const actors = await actorService.getActors(params.actors)
return { params, dids: actors.map((a) => a.did) }
const skeleton = async (input: {
ctx: Context
params: Params
}): Promise<SkeletonState> => {
const { ctx, params } = input
const dids = await ctx.hydrator.actor.getDidsDefined(params.actors)
return { dids }
}
const hydration = async (state: SkeletonState, ctx: Context) => {
const { actorService } = ctx
const { params, dids } = state
const { viewer } = params
const hydration = await actorService.views.profileDetailHydration(dids, {
viewer,
})
return { ...state, ...hydration }
const hydration = async (input: {
ctx: Context
params: Params
skeleton: SkeletonState
}) => {
const { ctx, params, skeleton } = input
return ctx.hydrator.hydrateProfilesDetailed(skeleton.dids, params.viewer)
}
const presentation = (state: HydrationState, ctx: Context) => {
const { actorService } = ctx
const { params, dids } = state
const { viewer } = params
const profiles = actorService.views.profileDetailPresentation(dids, state, {
viewer,
})
const profileViews = mapDefined(dids, (did) => profiles[did])
return { profiles: profileViews }
const presentation = (input: {
ctx: Context
params: Params
skeleton: SkeletonState
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = input
const profiles = mapDefined(skeleton.dids, (did) =>
ctx.views.profileDetailed(did, hydration),
)
return { profiles }
}
type Context = {
db: Database
actorService: ActorService
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
viewer: string | null
}
type SkeletonState = { params: Params; dids: string[] }
type HydrationState = SkeletonState & ProfileDetailHydrationState
type SkeletonState = { dids: string[] }

@@ -1,13 +1,12 @@
import { mapDefined } from '@atproto/common'
import AppContext from '../../../../context'
import { Database } from '../../../../db'
import { Actor } from '../../../../db/tables/actor'
import { notSoftDeletedClause } from '../../../../db/util'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/actor/getSuggestions'
import { createPipeline } from '../../../../pipeline'
import { ActorInfoMap, ActorService } from '../../../../services/actor'
import { BlockAndMuteState, GraphService } from '../../../../services/graph'
import { HydrationState, Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { DataPlaneClient } from '../../../../data-plane'
import { parseString } from '../../../../hydration/util'
export default function (server: Server, ctx: AppContext) {
const getSuggestions = createPipeline(
@@ -19,15 +18,8 @@ export default function (server: Server, ctx: AppContext) {
server.app.bsky.actor.getSuggestions({
auth: ctx.authVerifier.standardOptional,
handler: async ({ params, auth }) => {
const db = ctx.db.getReplica()
const actorService = ctx.services.actor(db)
const graphService = ctx.services.graph(db)
const viewer = auth.credentials.iss
const result = await getSuggestions(
{ ...params, viewer },
{ db, actorService, graphService },
)
const result = await getSuggestions({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
@@ -37,114 +29,80 @@ export default function (server: Server, ctx: AppContext) {
})
}
const skeleton = async (
params: Params,
ctx: Context,
): Promise<SkeletonState> => {
const { db } = ctx
const { viewer } = params
const alreadyIncluded = parseCursor(params.cursor) // @NOTE handles bad cursor e.g. on appview swap
const { ref } = db.db.dynamic
const suggestions = await db.db
.selectFrom('suggested_follow')
.innerJoin('actor', 'actor.did', 'suggested_follow.did')
.where(notSoftDeletedClause(ref('actor')))
.where('suggested_follow.did', '!=', viewer ?? '')
.whereNotExists((qb) =>
qb
.selectFrom('follow')
.selectAll()
.where('creator', '=', viewer ?? '')
.whereRef('subjectDid', '=', ref('actor.did')),
)
.if(alreadyIncluded.length > 0, (qb) =>
qb.where('suggested_follow.order', 'not in', alreadyIncluded),
)
.selectAll()
.orderBy('suggested_follow.order', 'asc')
.execute()
// always include first two
const firstTwo = suggestions.filter(
(row) => row.order === 1 || row.order === 2,
)
const rest = suggestions.filter((row) => row.order !== 1 && row.order !== 2)
const limited = firstTwo.concat(shuffle(rest)).slice(0, params.limit)
// if the result set ends up getting larger, consider using a seed included in the cursor for for the randomized shuffle
const cursor =
limited.length > 0
? limited
.map((row) => row.order.toString())
.concat(alreadyIncluded.map((id) => id.toString()))
.join(':')
: undefined
return { params, suggestions: limited, cursor }
}
const hydration = async (state: SkeletonState, ctx: Context) => {
const { graphService, actorService } = ctx
const { params, suggestions } = state
const { viewer } = params
const [actors, bam] = await Promise.all([
actorService.views.profiles(suggestions, viewer),
graphService.getBlockAndMuteState(
viewer ? suggestions.map((sug) => [viewer, sug.did]) : [],
),
])
return { ...state, bam, actors }
}
const noBlocksOrMutes = (state: HydrationState) => {
const { viewer } = state.params
if (!viewer) return state
state.suggestions = state.suggestions.filter(
(item) =>
!state.bam.block([viewer, item.did]) &&
!state.bam.mute([viewer, item.did]),
)
return state
}
const presentation = (state: HydrationState) => {
const { suggestions, actors, cursor } = state
const suggestedActors = mapDefined(suggestions, (sug) => actors[sug.did])
return { actors: suggestedActors, cursor }
}
const parseCursor = (cursor?: string): number[] => {
if (!cursor) {
return []
}
try {
return cursor
.split(':')
.map((id) => parseInt(id, 10))
.filter((id) => !isNaN(id))
} catch {
return []
const skeleton = async (input: {
ctx: Context
params: Params
}): Promise<Skeleton> => {
const { ctx, params } = input
// @NOTE for appview swap moving to rkey-based cursors which are somewhat permissive, should not hard-break pagination
const suggestions = await ctx.dataplane.getFollowSuggestions({
actorDid: params.viewer ?? undefined,
cursor: params.cursor,
limit: params.limit,
})
let dids = suggestions.dids
if (params.viewer !== null) {
const follows = await ctx.dataplane.getActorFollowsActors({
actorDid: params.viewer,
targetDids: dids,
})
dids = dids.filter((did, i) => !follows.uris[i] && did !== params.viewer)
}
return { dids, cursor: parseString(suggestions.cursor) }
}
const shuffle = <T>(arr: T[]): T[] => {
return arr
.map((value) => ({ value, sort: Math.random() }))
.sort((a, b) => a.sort - b.sort)
.map(({ value }) => value)
const hydration = async (input: {
ctx: Context
params: Params
skeleton: Skeleton
}) => {
const { ctx, params, skeleton } = input
return ctx.hydrator.hydrateProfilesDetailed(
skeleton.dids,
params.viewer,
true,
)
}
const noBlocksOrMutes = (input: {
ctx: Context
params: Params
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = input
skeleton.dids = skeleton.dids.filter(
(did) =>
!ctx.views.viewerBlockExists(did, hydration) &&
!ctx.views.viewerMuteExists(did, hydration),
)
return skeleton
}
const presentation = (input: {
ctx: Context
params: Params
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = input
const actors = mapDefined(skeleton.dids, (did) =>
ctx.views.profile(did, hydration),
)
return {
actors,
cursor: skeleton.cursor,
}
}
type Context = {
db: Database
actorService: ActorService
graphService: GraphService
dataplane: DataPlaneClient
hydrator: Hydrator
views: Views
}
type Params = QueryParams & { viewer: string | null }
type SkeletonState = { params: Params; suggestions: Actor[]; cursor?: string }
type HydrationState = SkeletonState & {
bam: BlockAndMuteState
actors: ActorInfoMap
type Params = QueryParams & {
viewer: string | null
}
type Skeleton = { dids: string[]; cursor?: string }

@@ -1,56 +1,111 @@
import AppContext from '../../../../context'
import { Server } from '../../../../lexicon'
import { cleanQuery } from '../../../../services/util/search'
import { mapDefined } from '@atproto/common'
import AtpAgent from '@atproto/api'
import { QueryParams } from '../../../../lexicon/types/app/bsky/actor/searchActors'
import {
HydrationFnInput,
PresentationFnInput,
RulesFnInput,
SkeletonFnInput,
createPipeline,
} from '../../../../pipeline'
import { Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { DataPlaneClient } from '../../../../data-plane'
import { parseString } from '../../../../hydration/util'
export default function (server: Server, ctx: AppContext) {
const searchActors = createPipeline(
skeleton,
hydration,
noBlocks,
presentation,
)
server.app.bsky.actor.searchActors({
auth: ctx.authVerifier.standardOptional,
handler: async ({ auth, params }) => {
const { cursor, limit } = params
const requester = auth.credentials.iss
const rawQuery = params.q ?? params.term
const query = cleanQuery(rawQuery || '')
const db = ctx.db.getReplica('search')
let results: string[]
let resCursor: string | undefined
if (ctx.searchAgent) {
// @NOTE cursors wont change on appview swap
const res =
await ctx.searchAgent.api.app.bsky.unspecced.searchActorsSkeleton({
q: query,
cursor,
limit,
})
results = res.data.actors.map((a) => a.did)
resCursor = res.data.cursor
} else {
const res = await ctx.services
.actor(ctx.db.getReplica('search'))
.getSearchResults({ query, limit, cursor })
results = res.results.map((a) => a.did)
resCursor = res.cursor
}
const actors = await ctx.services
.actor(db)
.views.profiles(results, requester)
const SKIP = []
const filtered = results.flatMap((did) => {
const actor = actors[did]
if (!actor) return SKIP
if (actor.viewer?.blocking || actor.viewer?.blockedBy) return SKIP
return actor
})
const viewer = auth.credentials.iss
const results = await searchActors({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: {
cursor: resCursor,
actors: filtered,
},
body: results,
}
},
})
}
const skeleton = async (inputs: SkeletonFnInput<Context, Params>) => {
const { ctx, params } = inputs
const term = params.q ?? params.term ?? ''
// @TODO
// add hits total
if (ctx.searchAgent) {
// @NOTE cursors wont change on appview swap
const { data: res } =
await ctx.searchAgent.api.app.bsky.unspecced.searchActorsSkeleton({
q: term,
cursor: params.cursor,
limit: params.limit,
})
return {
dids: res.actors.map(({ did }) => did),
cursor: parseString(res.cursor),
}
}
const res = await ctx.dataplane.searchActors({
term,
limit: params.limit,
cursor: params.cursor,
})
return {
dids: res.dids,
cursor: parseString(res.cursor),
}
}
const hydration = async (
inputs: HydrationFnInput<Context, Params, Skeleton>,
) => {
const { ctx, params, skeleton } = inputs
return ctx.hydrator.hydrateProfiles(skeleton.dids, params.viewer)
}
const noBlocks = (inputs: RulesFnInput<Context, Params, Skeleton>) => {
const { ctx, skeleton, hydration } = inputs
skeleton.dids = skeleton.dids.filter(
(did) => !ctx.views.viewerBlockExists(did, hydration),
)
return skeleton
}
const presentation = (
inputs: PresentationFnInput<Context, Params, Skeleton>,
) => {
const { ctx, skeleton, hydration } = inputs
const actors = mapDefined(skeleton.dids, (did) =>
ctx.views.profile(did, hydration),
)
return {
actors,
cursor: skeleton.cursor,
}
}
type Context = {
dataplane: DataPlaneClient
hydrator: Hydrator
views: Views
searchAgent?: AtpAgent
}
type Params = QueryParams & { viewer: string | null }
type Skeleton = {
dids: string[]
hitsTotal?: number
cursor?: string
}
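
Another editor's note (assumed behavior, not shown in this diff): the rewritten handlers pass dataplane and search-skeleton cursors through parseString from src/hydration/util before returning them. Together with the commits "handle empty cursor on list notifications" and "implement getLists, fix some empty cursors", the likely intent is that protobuf string fields default to '', so an exhausted cursor must become undefined and drop out of the response. A one-line stand-in under that assumption:

// Sketch of the assumed helper: treat the protobuf default '' as "no cursor".
const parseString = (str: string | undefined): string | undefined =>
  str && str.length > 0 ? str : undefined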

@@ -1,56 +1,107 @@
import AppContext from '../../../../context'
import { Server } from '../../../../lexicon'
import AtpAgent from '@atproto/api'
import { mapDefined } from '@atproto/common'
import { QueryParams } from '../../../../lexicon/types/app/bsky/actor/searchActorsTypeahead'
import {
cleanQuery,
getUserSearchQuerySimple,
} from '../../../../services/util/search'
HydrationFnInput,
PresentationFnInput,
RulesFnInput,
SkeletonFnInput,
createPipeline,
} from '../../../../pipeline'
import { Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { DataPlaneClient } from '../../../../data-plane'
import { parseString } from '../../../../hydration/util'
export default function (server: Server, ctx: AppContext) {
const searchActorsTypeahead = createPipeline(
skeleton,
hydration,
noBlocks,
presentation,
)
server.app.bsky.actor.searchActorsTypeahead({
auth: ctx.authVerifier.standardOptional,
handler: async ({ params, auth }) => {
const { limit } = params
const requester = auth.credentials.iss
const rawQuery = params.q ?? params.term
const query = cleanQuery(rawQuery || '')
const db = ctx.db.getReplica('search')
let results: string[]
if (ctx.searchAgent) {
const res =
await ctx.searchAgent.api.app.bsky.unspecced.searchActorsSkeleton({
q: query,
typeahead: true,
limit,
})
results = res.data.actors.map((a) => a.did)
} else {
const res = query
? await getUserSearchQuerySimple(db, { query, limit })
.selectAll('actor')
.execute()
: []
results = res.map((a) => a.did)
}
const actors = await ctx.services
.actor(db)
.views.profilesBasic(results, requester)
const SKIP = []
const filtered = results.flatMap((did) => {
const actor = actors[did]
if (!actor) return SKIP
if (actor.viewer?.blocking || actor.viewer?.blockedBy) return SKIP
return actor
})
const viewer = auth.credentials.iss
const results = await searchActorsTypeahead({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: {
actors: filtered,
},
body: results,
}
},
})
}
const skeleton = async (inputs: SkeletonFnInput<Context, Params>) => {
const { ctx, params } = inputs
const term = params.q ?? params.term ?? ''
// @TODO
// add typeahead option
// add hits total
if (ctx.searchAgent) {
const { data: res } =
await ctx.searchAgent.api.app.bsky.unspecced.searchActorsSkeleton({
typeahead: true,
q: term,
limit: params.limit,
})
return {
dids: res.actors.map(({ did }) => did),
cursor: parseString(res.cursor),
}
}
const res = await ctx.dataplane.searchActors({
term,
limit: params.limit,
})
return {
dids: res.dids,
cursor: parseString(res.cursor),
}
}
const hydration = async (
inputs: HydrationFnInput<Context, Params, Skeleton>,
) => {
const { ctx, params, skeleton } = inputs
return ctx.hydrator.hydrateProfilesBasic(skeleton.dids, params.viewer)
}
const noBlocks = (inputs: RulesFnInput<Context, Params, Skeleton>) => {
const { ctx, skeleton, hydration } = inputs
skeleton.dids = skeleton.dids.filter(
(did) => !ctx.views.viewerBlockExists(did, hydration),
)
return skeleton
}
const presentation = (
inputs: PresentationFnInput<Context, Params, Skeleton>,
) => {
const { ctx, skeleton, hydration } = inputs
const actors = mapDefined(skeleton.dids, (did) =>
ctx.views.profileBasic(did, hydration),
)
return {
actors,
}
}
type Context = {
dataplane: DataPlaneClient
hydrator: Hydrator
views: Views
searchAgent?: AtpAgent
}
type Params = QueryParams & { viewer: string | null }
type Skeleton = {
dids: string[]
}

@@ -1,69 +1,91 @@
import { InvalidRequestError } from '@atproto/xrpc-server'
import { mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/feed/getActorFeeds'
import AppContext from '../../../../context'
import { TimeCidKeyset, paginate } from '../../../../db/pagination'
import { createPipeline, noRules } from '../../../../pipeline'
import { HydrationState, Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { DataPlaneClient } from '../../../../data-plane'
import { parseString } from '../../../../hydration/util'
import { clearlyBadCursor } from '../../../util'
export default function (server: Server, ctx: AppContext) {
const getActorFeeds = createPipeline(
skeleton,
hydration,
noRules,
presentation,
)
server.app.bsky.feed.getActorFeeds({
auth: ctx.authVerifier.standardOptional,
handler: async ({ auth, params }) => {
const { actor, limit, cursor } = params
const viewer = auth.credentials.iss
if (TimeCidKeyset.clearlyBad(cursor)) {
return {
encoding: 'application/json',
body: { feeds: [] },
}
}
const db = ctx.db.getReplica()
const actorService = ctx.services.actor(db)
const feedService = ctx.services.feed(db)
const creatorRes = await actorService.getActor(actor)
if (!creatorRes) {
throw new InvalidRequestError(`Actor not found: ${actor}`)
}
const { ref } = db.db.dynamic
let feedsQb = feedService
.selectFeedGeneratorQb(viewer)
.where('feed_generator.creator', '=', creatorRes.did)
const keyset = new TimeCidKeyset(
ref('feed_generator.createdAt'),
ref('feed_generator.cid'),
)
feedsQb = paginate(feedsQb, {
limit,
cursor,
keyset,
})
const [feedsRes, profiles] = await Promise.all([
feedsQb.execute(),
actorService.views.profiles([creatorRes], viewer),
])
if (!profiles[creatorRes.did]) {
throw new InvalidRequestError(`Actor not found: ${actor}`)
}
const feeds = mapDefined(feedsRes, (row) => {
const feed = {
...row,
viewer: viewer ? { like: row.viewerLike } : undefined,
}
return feedService.views.formatFeedGeneratorView(feed, profiles)
})
const result = await getActorFeeds({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: {
cursor: keyset.packFromResult(feedsRes),
feeds,
},
body: result,
}
},
})
}
const skeleton = async (inputs: {
ctx: Context
params: Params
}): Promise<Skeleton> => {
const { ctx, params } = inputs
if (clearlyBadCursor(params.cursor)) {
return { feedUris: [] }
}
const [did] = await ctx.hydrator.actor.getDids([params.actor])
if (!did) {
throw new InvalidRequestError('Profile not found')
}
const feedsRes = await ctx.dataplane.getActorFeeds({
actorDid: did,
cursor: params.cursor,
limit: params.limit,
})
return {
feedUris: feedsRes.uris,
cursor: parseString(feedsRes.cursor),
}
}
const hydration = async (inputs: {
ctx: Context
params: Params
skeleton: Skeleton
}) => {
const { ctx, params, skeleton } = inputs
return await ctx.hydrator.hydrateFeedGens(skeleton.feedUris, params.viewer)
}
const presentation = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = inputs
const feeds = mapDefined(skeleton.feedUris, (uri) =>
ctx.views.feedGenerator(uri, hydration),
)
return {
feeds,
cursor: skeleton.cursor,
}
}
type Context = {
hydrator: Hydrator
views: Views
dataplane: DataPlaneClient
}
type Params = QueryParams & { viewer: string | null }
type Skeleton = {
feedUris: string[]
cursor?: string
}

@@ -1,19 +1,16 @@
import { InvalidRequestError } from '@atproto/xrpc-server'
import { mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/feed/getActorLikes'
import { FeedKeyset } from '../util/feed'
import { paginate } from '../../../../db/pagination'
import AppContext from '../../../../context'
import { setRepoRev } from '../../../util'
import {
FeedHydrationState,
FeedRow,
FeedService,
} from '../../../../services/feed'
import { Database } from '../../../../db'
import { ActorService } from '../../../../services/actor'
import { GraphService } from '../../../../services/graph'
import { clearlyBadCursor, setRepoRev } from '../../../util'
import { createPipeline } from '../../../../pipeline'
import { HydrationState, Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { DataPlaneClient } from '../../../../data-plane'
import { parseString } from '../../../../hydration/util'
import { creatorFromUri } from '../../../../views/util'
import { FeedItem } from '../../../../hydration/feed'
export default function (server: Server, ctx: AppContext) {
const getActorLikes = createPipeline(
@@ -26,19 +23,10 @@ export default function (server: Server, ctx: AppContext) {
auth: ctx.authVerifier.standardOptional,
handler: async ({ params, auth, res }) => {
const viewer = auth.credentials.iss
const db = ctx.db.getReplica()
const actorService = ctx.services.actor(db)
const feedService = ctx.services.feed(db)
const graphService = ctx.services.graph(db)
const [result, repoRev] = await Promise.all([
getActorLikes(
{ ...params, viewer },
{ db, actorService, feedService, graphService },
),
actorService.getRepoRev(viewer),
])
const result = await getActorLikes({ ...params, viewer }, ctx)
const repoRev = await ctx.hydrator.actor.getRepoRevSafe(viewer)
setRepoRev(res, repoRev)
return {
@@ -49,81 +37,80 @@ export default function (server: Server, ctx: AppContext) {
})
}
const skeleton = async (
params: Params,
ctx: Context,
): Promise<SkeletonState> => {
const { db, actorService, feedService } = ctx
const skeleton = async (inputs: {
ctx: Context
params: Params
}): Promise<Skeleton> => {
const { ctx, params } = inputs
const { actor, limit, cursor, viewer } = params
const { ref } = db.db.dynamic
const actorRes = await actorService.getActor(actor)
if (!actorRes) {
throw new InvalidRequestError('Profile not found')
if (clearlyBadCursor(cursor)) {
return { items: [] }
}
const actorDid = actorRes.did
if (!viewer || viewer !== actorDid) {
const [actorDid] = await ctx.hydrator.actor.getDids([actor])
if (!actorDid || !viewer || viewer !== actorDid) {
throw new InvalidRequestError('Profile not found')
}
if (FeedKeyset.clearlyBad(cursor)) {
return { params, feedItems: [] }
}
let feedItemsQb = feedService
.selectFeedItemQb()
.innerJoin('like', 'like.subject', 'feed_item.uri')
.where('like.creator', '=', actorDid)
const keyset = new FeedKeyset(ref('like.sortAt'), ref('like.cid'))
feedItemsQb = paginate(feedItemsQb, {
const likesRes = await ctx.dataplane.getActorLikes({
actorDid,
limit,
cursor,
keyset,
})
const feedItems = await feedItemsQb.execute()
const items = likesRes.likes.map((l) => ({ post: { uri: l.subject } }))
return { params, feedItems, cursor: keyset.packFromResult(feedItems) }
return {
items,
cursor: parseString(likesRes.cursor),
}
}
const hydration = async (state: SkeletonState, ctx: Context) => {
const { feedService } = ctx
const { params, feedItems } = state
const refs = feedService.feedItemRefs(feedItems)
const hydrated = await feedService.feedHydration({
...refs,
viewer: params.viewer,
const hydration = async (inputs: {
ctx: Context
params: Params
skeleton: Skeleton
}) => {
const { ctx, params, skeleton } = inputs
return await ctx.hydrator.hydrateFeedItems(skeleton.items, params.viewer)
}
const noPostBlocks = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = inputs
skeleton.items = skeleton.items.filter((item) => {
const creator = creatorFromUri(item.post.uri)
return !ctx.views.viewerBlockExists(creator, hydration)
})
return { ...state, ...hydrated }
return skeleton
}
const noPostBlocks = (state: HydrationState) => {
const { viewer } = state.params
state.feedItems = state.feedItems.filter(
(item) => !viewer || !state.bam.block([viewer, item.postAuthorDid]),
const presentation = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = inputs
const feed = mapDefined(skeleton.items, (item) =>
ctx.views.feedViewPost(item, hydration),
)
return state
}
const presentation = (state: HydrationState, ctx: Context) => {
const { feedService } = ctx
const { feedItems, cursor, params } = state
const feed = feedService.views.formatFeed(feedItems, state, params.viewer)
return { feed, cursor }
return {
feed,
cursor: skeleton.cursor,
}
}
type Context = {
db: Database
feedService: FeedService
actorService: ActorService
graphService: GraphService
hydrator: Hydrator
views: Views
dataplane: DataPlaneClient
}
type Params = QueryParams & { viewer: string | null }
type SkeletonState = { params: Params; feedItems: FeedRow[]; cursor?: string }
type HydrationState = SkeletonState & FeedHydrationState
type Skeleton = {
items: FeedItem[]
cursor?: string
}

@@ -1,19 +1,21 @@
import { mapDefined } from '@atproto/common'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/feed/getAuthorFeed'
import { FeedKeyset } from '../util/feed'
import { paginate } from '../../../../db/pagination'
import AppContext from '../../../../context'
import { setRepoRev } from '../../../util'
import { Database } from '../../../../db'
import {
FeedHydrationState,
FeedRow,
FeedService,
} from '../../../../services/feed'
import { ActorService } from '../../../../services/actor'
import { GraphService } from '../../../../services/graph'
import { clearlyBadCursor, setRepoRev } from '../../../util'
import { createPipeline } from '../../../../pipeline'
import {
HydrationState,
Hydrator,
mergeStates,
} from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { DataPlaneClient } from '../../../../data-plane'
import { parseString } from '../../../../hydration/util'
import { Actor } from '../../../../hydration/actor'
import { FeedItem } from '../../../../hydration/feed'
import { FeedType } from '../../../../proto/bsky_pb'
export default function (server: Server, ctx: AppContext) {
const getAuthorFeed = createPipeline(
@@ -25,24 +27,14 @@ export default function (server: Server, ctx: AppContext) {
server.app.bsky.feed.getAuthorFeed({
auth: ctx.authVerifier.optionalStandardOrRole,
handler: async ({ params, auth, res }) => {
const db = ctx.db.getReplica()
const actorService = ctx.services.actor(db)
const feedService = ctx.services.feed(db)
const graphService = ctx.services.graph(db)
const { viewer } = ctx.authVerifier.parseCreds(auth)
const { viewer, canViewTakedowns } = ctx.authVerifier.parseCreds(auth)
const [result, repoRev] = await Promise.all([
getAuthorFeed(
{
...params,
includeSoftDeleted: auth.credentials.type === 'role',
viewer,
},
{ db, actorService, feedService, graphService },
),
actorService.getRepoRev(viewer),
])
const result = await getAuthorFeed(
{ ...params, viewer, includeTakedowns: canViewTakedowns },
ctx,
)
const repoRev = await ctx.hydrator.actor.getRepoRevSafe(viewer)
setRepoRev(res, repoRev)
return {
@@ -53,135 +45,122 @@ export default function (server: Server, ctx: AppContext) {
})
}
export const skeleton = async (
params: Params,
ctx: Context,
): Promise<SkeletonState> => {
const { cursor, limit, actor, filter, viewer, includeSoftDeleted } = params
const { db, actorService, feedService, graphService } = ctx
const { ref } = db.db.dynamic
const FILTER_TO_FEED_TYPE = {
posts_with_replies: undefined, // default: all posts, replies, and reposts
posts_no_replies: FeedType.POSTS_NO_REPLIES,
posts_with_media: FeedType.POSTS_WITH_MEDIA,
posts_and_author_threads: FeedType.POSTS_AND_AUTHOR_THREADS,
}
// maybe resolve did first
const actorRes = await actorService.getActor(actor, includeSoftDeleted)
if (!actorRes) {
export const skeleton = async (inputs: {
ctx: Context
params: Params
}): Promise<Skeleton> => {
const { ctx, params } = inputs
const [did] = await ctx.hydrator.actor.getDids([params.actor])
if (!did) {
throw new InvalidRequestError('Profile not found')
}
const actorDid = actorRes.did
// verify there is not a block between requester & subject
if (viewer !== null) {
const blocks = await graphService.getBlockState([[viewer, actorDid]])
if (blocks.blocking([viewer, actorDid])) {
throw new InvalidRequestError(
`Requester has blocked actor: ${actor}`,
'BlockedActor',
)
}
if (blocks.blockedBy([viewer, actorDid])) {
throw new InvalidRequestError(
`Requester is blocked by actor: $${actor}`,
'BlockedByActor',
)
}
const actors = await ctx.hydrator.actor.getActors(
[did],
params.includeTakedowns,
)
const actor = actors.get(did)
if (!actor) {
throw new InvalidRequestError('Profile not found')
}
if (FeedKeyset.clearlyBad(cursor)) {
return { params, feedItems: [] }
if (clearlyBadCursor(params.cursor)) {
return { actor, items: [] }
}
// defaults to posts, reposts, and replies
let feedItemsQb = feedService
.selectFeedItemQb()
.where('originatorDid', '=', actorDid)
if (filter === 'posts_with_media') {
feedItemsQb = feedItemsQb
// only your own posts
.where('type', '=', 'post')
// only posts with media
.whereExists((qb) =>
qb
.selectFrom('post_embed_image')
.select('post_embed_image.postUri')
.whereRef('post_embed_image.postUri', '=', 'feed_item.postUri'),
)
} else if (filter === 'posts_no_replies') {
feedItemsQb = feedItemsQb.where((qb) =>
qb.where('post.replyParent', 'is', null).orWhere('type', '=', 'repost'),
)
} else if (filter === 'posts_and_author_threads') {
feedItemsQb = feedItemsQb.where((qb) =>
qb
.where('type', '=', 'repost')
.orWhere('post.replyParent', 'is', null)
.orWhere('post.replyRoot', 'like', `at://${actorDid}/%`),
)
}
const keyset = new FeedKeyset(ref('feed_item.sortAt'), ref('feed_item.cid'))
feedItemsQb = paginate(feedItemsQb, {
limit,
cursor,
keyset,
const res = await ctx.dataplane.getAuthorFeed({
actorDid: did,
limit: params.limit,
cursor: params.cursor,
feedType: FILTER_TO_FEED_TYPE[params.filter],
})
const feedItems = await feedItemsQb.execute()
return {
params,
feedItems,
cursor: keyset.packFromResult(feedItems),
actor,
items: res.items.map((item) => ({
post: { uri: item.uri, cid: item.cid || undefined },
repost: item.repost
? { uri: item.repost, cid: item.repostCid || undefined }
: undefined,
})),
cursor: parseString(res.cursor),
}
}
const hydration = async (state: SkeletonState, ctx: Context) => {
const { feedService } = ctx
const { params, feedItems } = state
const refs = feedService.feedItemRefs(feedItems)
const hydrated = await feedService.feedHydration({
...refs,
viewer: params.viewer,
includeSoftDeleted: params.includeSoftDeleted,
})
return { ...state, ...hydrated }
const hydration = async (inputs: {
ctx: Context
params: Params
skeleton: Skeleton
}): Promise<HydrationState> => {
const { ctx, params, skeleton } = inputs
const [feedPostState, profileViewerState = {}] = await Promise.all([
ctx.hydrator.hydrateFeedItems(
skeleton.items,
params.viewer,
params.includeTakedowns,
),
params.viewer
? ctx.hydrator.hydrateProfileViewers([skeleton.actor.did], params.viewer)
: undefined,
])
return mergeStates(feedPostState, profileViewerState)
}
const noBlocksOrMutedReposts = (state: HydrationState) => {
const { viewer } = state.params
state.feedItems = state.feedItems.filter((item) => {
if (!viewer) return true
const noBlocksOrMutedReposts = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}): Skeleton => {
const { ctx, skeleton, hydration } = inputs
const relationship = hydration.profileViewers?.get(skeleton.actor.did)
if (relationship?.blocking || relationship?.blockingByList) {
throw new InvalidRequestError(
`Requester has blocked actor: ${skeleton.actor.did}`,
'BlockedActor',
)
}
if (relationship?.blockedBy || relationship?.blockedByList) {
throw new InvalidRequestError(
`Requester is blocked by actor: ${skeleton.actor.did}`,
'BlockedByActor',
)
}
skeleton.items = skeleton.items.filter((item) => {
const bam = ctx.views.feedItemBlocksAndMutes(item, hydration)
return (
!state.bam.block([viewer, item.postAuthorDid]) &&
(item.type === 'post' || !state.bam.mute([viewer, item.postAuthorDid]))
!bam.authorBlocked &&
!bam.originatorBlocked &&
!(bam.authorMuted && !bam.originatorMuted)
)
})
return state
return skeleton
}
const presentation = (state: HydrationState, ctx: Context) => {
const { feedService } = ctx
const { feedItems, cursor, params } = state
const feed = feedService.views.formatFeed(feedItems, state, params.viewer)
return { feed, cursor }
const presentation = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = inputs
const feed = mapDefined(skeleton.items, (item) =>
ctx.views.feedViewPost(item, hydration),
)
return { feed, cursor: skeleton.cursor }
}
type Context = {
db: Database
actorService: ActorService
feedService: FeedService
graphService: GraphService
hydrator: Hydrator
views: Views
dataplane: DataPlaneClient
}
type Params = QueryParams & {
viewer: string | null
includeSoftDeleted: boolean
}
type Params = QueryParams & { viewer: string | null; includeTakedowns: boolean }
type SkeletonState = {
params: Params
feedItems: FeedRow[]
type Skeleton = {
actor: Actor
items: FeedItem[]
cursor?: string
}
type HydrationState = SkeletonState & FeedHydrationState

@@ -1,3 +1,4 @@
import { mapDefined } from '@atproto/common'
import {
InvalidRequestError,
UpstreamFailureError,
@@ -5,25 +6,27 @@
serverTimingHeader,
} from '@atproto/xrpc-server'
import { ResponseType, XRPCError } from '@atproto/xrpc'
import {
DidDocument,
PoorlyFormattedDidDocumentError,
getFeedGen,
} from '@atproto/identity'
import { AtpAgent, AppBskyFeedGetFeedSkeleton } from '@atproto/api'
import { noUndefinedVals } from '@atproto/common'
import { QueryParams as GetFeedParams } from '../../../../lexicon/types/app/bsky/feed/getFeed'
import { OutputSchema as SkeletonOutput } from '../../../../lexicon/types/app/bsky/feed/getFeedSkeleton'
import { SkeletonFeedPost } from '../../../../lexicon/types/app/bsky/feed/defs'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { Database } from '../../../../db'
import {
FeedHydrationState,
FeedRow,
FeedService,
} from '../../../../services/feed'
import { createPipeline } from '../../../../pipeline'
HydrationFnInput,
PresentationFnInput,
RulesFnInput,
SkeletonFnInput,
createPipeline,
} from '../../../../pipeline'
import { FeedItem } from '../../../../hydration/feed'
import { GetIdentityByDidResponse } from '../../../../proto/bsky_pb'
import {
Code,
getServiceEndpoint,
isDataplaneError,
unpackIdentityServices,
} from '../../../../data-plane'
export default function (server: Server, ctx: AppContext) {
const getFeed = createPipeline(
@@ -35,8 +38,6 @@ export default function (server: Server, ctx: AppContext) {
server.app.bsky.feed.getFeed({
auth: ctx.authVerifier.standardOptionalAnyAud,
handler: async ({ params, auth, req }) => {
const db = ctx.db.getReplica()
const feedService = ctx.services.feed(db)
const viewer = auth.credentials.iss
const headers = noUndefinedVals({
authorization: req.headers['authorization'],
@@ -44,13 +45,8 @@ export default function (server: Server, ctx: AppContext) {
})
// @NOTE feed cursors should not be affected by appview swap
const { timerSkele, timerHydr, resHeaders, ...result } = await getFeed(
{ ...params, viewer },
{
db,
feedService,
appCtx: ctx,
headers,
},
{ ...params, viewer, headers },
ctx,
)
return {
@@ -66,125 +62,113 @@
}
const skeleton = async (
params: Params,
ctx: Context,
): Promise<SkeletonState> => {
inputs: SkeletonFnInput<Context, Params>,
): Promise<Skeleton> => {
const { ctx, params } = inputs
const timerSkele = new ServerTimer('skele').start()
const feedParams: GetFeedParams = {
feed: params.feed,
limit: params.limit,
cursor: params.cursor,
}
const { feedItems, cursor, resHeaders, ...passthrough } =
await skeletonFromFeedGen(ctx, feedParams)
return {
params,
const {
feedItems: algoItems,
cursor,
feedItems,
resHeaders,
...passthrough
} = await skeletonFromFeedGen(ctx, params)
return {
cursor,
items: algoItems.map(toFeedItem),
timerSkele: timerSkele.stop(),
timerHydr: new ServerTimer('hydr').start(),
resHeaders,
passthrough,
}
}
const hydration = async (state: SkeletonState, ctx: Context) => {
const hydration = async (
inputs: HydrationFnInput<Context, Params, Skeleton>,
) => {
const { ctx, params, skeleton } = inputs
const timerHydr = new ServerTimer('hydr').start()
const { feedService } = ctx
const { params, feedItems } = state
const refs = feedService.feedItemRefs(feedItems)
const hydrated = await feedService.feedHydration({
...refs,
viewer: params.viewer,
})
return { ...state, ...hydrated, timerHydr: timerHydr.stop() }
const hydration = await ctx.hydrator.hydrateFeedItems(
skeleton.items,
params.viewer,
)
skeleton.timerHydr = timerHydr.stop()
return hydration
}
const noBlocksOrMutes = (state: HydrationState) => {
const { viewer } = state.params
state.feedItems = state.feedItems.filter((item) => {
if (!viewer) return true
const noBlocksOrMutes = (inputs: RulesFnInput<Context, Params, Skeleton>) => {
const { ctx, skeleton, hydration } = inputs
skeleton.items = skeleton.items.filter((item) => {
const bam = ctx.views.feedItemBlocksAndMutes(item, hydration)
return (
!state.bam.block([viewer, item.postAuthorDid]) &&
!state.bam.block([viewer, item.originatorDid]) &&
!state.bam.mute([viewer, item.postAuthorDid]) &&
!state.bam.mute([viewer, item.originatorDid])
!bam.authorBlocked &&
!bam.authorMuted &&
!bam.originatorBlocked &&
!bam.originatorMuted
)
})
return state
return skeleton
}
const presentation = (state: HydrationState, ctx: Context) => {
const { feedService } = ctx
const { feedItems, cursor, passthrough, params } = state
const feed = feedService.views.formatFeed(feedItems, state, params.viewer)
const presentation = (
inputs: PresentationFnInput<Context, Params, Skeleton>,
) => {
const { ctx, params, skeleton, hydration } = inputs
const feed = mapDefined(skeleton.items, (item) => {
return ctx.views.feedViewPost(item, hydration)
}).slice(0, params.limit)
return {
feed,
cursor,
timerSkele: state.timerSkele,
timerHydr: state.timerHydr,
resHeaders: state.resHeaders,
...passthrough,
cursor: skeleton.cursor,
timerSkele: skeleton.timerSkele,
timerHydr: skeleton.timerHydr,
resHeaders: skeleton.resHeaders,
...skeleton.passthrough,
}
}
type Context = {
db: Database
feedService: FeedService
appCtx: AppContext
type Context = AppContext
type Params = GetFeedParams & {
viewer: string | null
headers: Record<string, string>
}
type Params = GetFeedParams & { viewer: string | null }
type SkeletonState = {
params: Params
feedItems: FeedRow[]
type Skeleton = {
items: FeedItem[]
passthrough: Record<string, unknown> // pass through additional items in feedgen response
resHeaders?: Record<string, string>
cursor?: string
timerSkele: ServerTimer
}
type HydrationState = SkeletonState &
FeedHydrationState & { feedItems: FeedRow[]; timerHydr: ServerTimer }
type AlgoResponse = {
feedItems: FeedRow[]
resHeaders?: Record<string, string>
cursor?: string
timerHydr: ServerTimer
}
const skeletonFromFeedGen = async (
ctx: Context,
params: GetFeedParams,
params: Params,
): Promise<AlgoResponse> => {
const { db, appCtx, headers } = ctx
const { feed } = params
// Resolve and fetch feed skeleton
const found = await db.db
.selectFrom('feed_generator')
.where('uri', '=', feed)
.select('feedDid')
.executeTakeFirst()
if (!found) {
const { feed, headers } = params
const found = await ctx.hydrator.feed.getFeedGens([feed], true)
const feedDid = await found.get(feed)?.record.did
if (!feedDid) {
throw new InvalidRequestError('could not find feed')
}
const feedDid = found.feedDid
let resolved: DidDocument | null
let identity: GetIdentityByDidResponse
try {
resolved = await appCtx.idResolver.did.resolve(feedDid)
identity = await ctx.dataplane.getIdentityByDid({ did: feedDid })
} catch (err) {
if (err instanceof PoorlyFormattedDidDocumentError) {
throw new InvalidRequestError(`invalid did document: ${feedDid}`)
if (isDataplaneError(err, Code.NotFound)) {
throw new InvalidRequestError(`could not resolve identity: ${feedDid}`)
}
throw err
}
if (!resolved) {
throw new InvalidRequestError(`could not resolve did document: ${feedDid}`)
}
const fgEndpoint = getFeedGen(resolved)
const services = unpackIdentityServices(identity.services)
const fgEndpoint = getServiceEndpoint(services, {
id: 'bsky_fg',
type: 'BskyFeedGenerator',
})
if (!fgEndpoint) {
throw new InvalidRequestError(
`invalid feed generator service details in did document: ${feedDid}`,
@@ -197,9 +181,16 @@ const skeletonFromFeedGen = async (
let resHeaders: Record<string, string> | undefined = undefined
try {
// @TODO currently passthrough auth headers from pds
const result = await agent.api.app.bsky.feed.getFeedSkeleton(params, {
headers,
})
const result = await agent.api.app.bsky.feed.getFeedSkeleton(
{
feed: params.feed,
limit: params.limit,
cursor: params.cursor,
},
{
headers,
},
)
skeleton = result.data
if (result.headers['content-language']) {
resHeaders = {
@@ -225,33 +216,30 @@
}
const { feed: feedSkele, ...skele } = skeleton
const feedItems = feedSkele.map((item) => ({
itemUri:
typeof item.reason?.repost === 'string' ? item.reason.repost : item.post,
postUri: item.post,
}))
return { ...skele, resHeaders, feedItems }
}

export type AlgoResponse = {
feedItems: AlgoResponseItem[]
resHeaders?: Record<string, string>
cursor?: string
}

export type AlgoResponseItem = {
itemUri: string
postUri: string
}

export const toFeedItem = (feedItem: AlgoResponseItem): FeedItem => ({
post: { uri: feedItem.postUri },
repost:
feedItem.itemUri === feedItem.postUri
? undefined
: { uri: feedItem.itemUri },
})
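// Illustrative sketch (not part of the diff) of how AlgoResponseItem and
// toFeedItem above are expected to compose: a feedgen skeleton is normalized
// into feed items whose `repost` ref is only set when the item URI differs
// from the post URI. The example data below is hypothetical.
const exampleSkeleton: AlgoResponseItem[] = [
  // plain post: itemUri === postUri, so toFeedItem leaves `repost` undefined
  {
    itemUri: 'at://did:example:alice/app.bsky.feed.post/1',
    postUri: 'at://did:example:alice/app.bsky.feed.post/1',
  },
  // reposted post: itemUri points at the repost record, postUri at the post
  {
    itemUri: 'at://did:example:bob/app.bsky.feed.repost/2',
    postUri: 'at://did:example:alice/app.bsky.feed.post/1',
  },
]
// Feed items in the { post, repost? } shape that the hydrator consumes.
const exampleFeedItems = exampleSkeleton.map(toFeedItem)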

@@ -1,11 +1,13 @@
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { GetIdentityByDidResponse } from '../../../../proto/bsky_pb'
import {
Code,
getServiceEndpoint,
isDataplaneError,
unpackIdentityServices,
} from '../../../../data-plane'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.feed.getFeedGenerator({
@@ -14,47 +16,37 @@ export default function (server: Server, ctx: AppContext) {
const { feed } = params
const viewer = auth.credentials.iss
const hydration = await ctx.hydrator.hydrateFeedGens([feed], viewer)
const feedInfo = hydration.feedgens?.get(feed)
if (!feedInfo) {
throw new InvalidRequestError('could not find feed')
}
const feedDid = feedInfo.record.did
let identity: GetIdentityByDidResponse
try {
identity = await ctx.dataplane.getIdentityByDid({ did: feedDid })
} catch (err) {
if (isDataplaneError(err, Code.NotFound)) {
throw new InvalidRequestError(
`could not resolve identity: ${feedDid}`,
)
}
throw err
}
const services = unpackIdentityServices(identity.services)
const fgEndpoint = getServiceEndpoint(services, {
id: 'bsky_fg',
type: 'BskyFeedGenerator',
})
if (!fgEndpoint) {
throw new InvalidRequestError(
`invalid feed generator service details in did document: ${feedDid}`,
)
}
const feedView = ctx.views.feedGenerator(feed, hydration)
if (!feedView) {
throw new InvalidRequestError('could not find feed')
}

@@ -1,10 +1,10 @@
import { mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/feed/getFeedGenerators'
import AppContext from '../../../../context'
import { createPipeline, noRules } from '../../../../pipeline'
import { HydrationState, Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
export default function (server: Server, ctx: AppContext) {
const getFeedGenerators = createPipeline(
@@ -16,17 +16,8 @@ export default function (server: Server, ctx: AppContext) {
server.app.bsky.feed.getFeedGenerators({
auth: ctx.authVerifier.standardOptional,
handler: async ({ params, auth }) => {
const viewer = auth.credentials.iss
const view = await getFeedGenerators({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: view,
@@ -35,46 +26,42 @@ export default function (server: Server, ctx: AppContext) {
})
}
const skeleton = async (inputs: { params: Params }): Promise<Skeleton> => {
return {
feedUris: inputs.params.feeds,
}
}

const hydration = async (inputs: {
ctx: Context
params: Params
skeleton: Skeleton
}) => {
const { ctx, params, skeleton } = inputs
return await ctx.hydrator.hydrateFeedGens(skeleton.feedUris, params.viewer)
}

const presentation = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = inputs
const feeds = mapDefined(skeleton.feedUris, (uri) =>
ctx.views.feedGenerator(uri, hydration),
)
return {
feeds,
}
}

type Context = {
hydrator: Hydrator
views: Views
}

type Params = QueryParams & { viewer: string | null }

type Skeleton = {
feedUris: string[]
}
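// Illustrative sketch of the skeleton -> hydration -> rules -> presentation
// pipeline shape that the handlers in this diff follow. createMiniPipeline and
// its generic names are hypothetical stand-ins; the package's real
// createPipeline (see ../../../../pipeline) has a richer signature.
type MiniInputs<C, P, S, H> = { ctx: C; params: P; skeleton: S; hydration: H }

const createMiniPipeline =
  <C, P, S, H, V>(
    skeleton: (i: { ctx: C; params: P }) => Promise<S>,
    hydration: (i: { ctx: C; params: P; skeleton: S }) => Promise<H>,
    rules: (i: MiniInputs<C, P, S, H>) => S,
    presentation: (i: MiniInputs<C, P, S, H>) => V,
  ) =>
  async (params: P, ctx: C): Promise<V> => {
    // 1. skeleton: fetch bare keys (URIs/DIDs), typically from the dataplane
    let skele = await skeleton({ ctx, params })
    // 2. hydration: bulk-load records, profiles, and aggregates for those keys
    const hydr = await hydration({ ctx, params, skeleton: skele })
    // 3. rules: drop items the viewer should not see (blocks, mutes, gates)
    skele = rules({ ctx, params, skeleton: skele, hydration: hydr })
    // 4. presentation: shape hydrated data into lexicon view objects
    return presentation({ ctx, params, skeleton: skele, hydration: hydr })
  }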

@@ -1,29 +1,22 @@
import { mapDefined } from '@atproto/common'
import { normalizeDatetimeAlways } from '@atproto/syntax'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/feed/getLikes'
import AppContext from '../../../../context'
import { createPipeline } from '../../../../pipeline'
import { HydrationState, Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { parseString } from '../../../../hydration/util'
import { creatorFromUri } from '../../../../views/util'
import { clearlyBadCursor } from '../../../util'
export default function (server: Server, ctx: AppContext) {
const getLikes = createPipeline(skeleton, hydration, noBlocks, presentation)
server.app.bsky.feed.getLikes({
auth: ctx.authVerifier.standardOptional,
handler: async ({ params, auth }) => {
const viewer = auth.credentials.iss
const result = await getLikes({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
@@ -33,99 +26,86 @@ export default function (server: Server, ctx: AppContext) {
})
}
const skeleton = async (inputs: {
ctx: Context
params: Params
}): Promise<Skeleton> => {
const { ctx, params } = inputs
if (clearlyBadCursor(params.cursor)) {
return { likes: [] }
}
const likesRes = await ctx.hydrator.dataplane.getLikesBySubject({
subject: { uri: params.uri, cid: params.cid },
cursor: params.cursor,
limit: params.limit,
})
return {
likes: likesRes.uris,
cursor: parseString(likesRes.cursor),
}
}
const hydration = async (inputs: {
ctx: Context
params: Params
skeleton: Skeleton
}) => {
const { ctx, params, skeleton } = inputs
return await ctx.hydrator.hydrateLikes(skeleton.likes, params.viewer)
}

const noBlocks = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = inputs
skeleton.likes = skeleton.likes.filter((uri) => {
const creator = creatorFromUri(uri)
return !ctx.views.viewerBlockExists(creator, hydration)
})
return skeleton
}

const presentation = (inputs: {
ctx: Context
params: Params
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, params, skeleton, hydration } = inputs
const likeViews = mapDefined(skeleton.likes, (uri) => {
const like = hydration.likes?.get(uri)
if (!like || !like.record) {
return
}
const creatorDid = creatorFromUri(uri)
const actor = ctx.views.profile(creatorDid, hydration)
if (!actor) {
return
}
return {
actor,
createdAt: normalizeDatetimeAlways(like.record.createdAt),
indexedAt: like.sortedAt.toISOString(),
}
})
return {
likes: likeViews,
cursor: skeleton.cursor,
uri: params.uri,
cid: params.cid,
}
}
type Context = {
hydrator: Hydrator
views: Views
}

type Params = QueryParams & { viewer: string | null }

type Skeleton = {
likes: string[]
cursor?: string
}
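// Illustrative sketch of what the creatorFromUri/didFromUri helpers used above
// are assumed to do: the authority (host) segment of an at:// URI is the DID of
// the record's creator. The helper name below is hypothetical.
import { AtUri } from '@atproto/syntax'

const creatorDidFromAtUri = (uri: string): string => new AtUri(uri).hostname
// e.g. creatorDidFromAtUri('at://did:example:alice/app.bsky.feed.like/3jxyz')
//      === 'did:example:alice'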

@@ -1,18 +1,14 @@
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/feed/getListFeed'
import AppContext from '../../../../context'
import { clearlyBadCursor, setRepoRev } from '../../../util'
import { createPipeline } from '../../../../pipeline'
import { HydrationState, Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { DataPlaneClient } from '../../../../data-plane'
import { mapDefined } from '@atproto/common'
import { parseString } from '../../../../hydration/util'
import { FeedItem } from '../../../../hydration/feed'
export default function (server: Server, ctx: AppContext) {
const getListFeed = createPipeline(
@@ -25,19 +21,10 @@ export default function (server: Server, ctx: AppContext) {
auth: ctx.authVerifier.standardOptional,
handler: async ({ params, auth, res }) => {
const viewer = auth.credentials.iss
const result = await getListFeed({ ...params, viewer }, ctx)
const repoRev = await ctx.hydrator.actor.getRepoRevSafe(viewer)
setRepoRev(res, repoRev)
return {
@@ -48,84 +35,78 @@ export default function (server: Server, ctx: AppContext) {
})
}
export const skeleton = async (inputs: {
ctx: Context
params: Params
}): Promise<Skeleton> => {
const { ctx, params } = inputs
if (clearlyBadCursor(params.cursor)) {
return { items: [] }
}
const res = await ctx.dataplane.getListFeed({
listUri: params.list,
limit: params.limit,
cursor: params.cursor,
})
return {
items: res.items.map((item) => ({
post: { uri: item.uri, cid: item.cid || undefined },
repost: item.repost
? { uri: item.repost, cid: item.repostCid || undefined }
: undefined,
})),
cursor: parseString(res.cursor),
}
}
const hydration = async (inputs: {
ctx: Context
params: Params
skeleton: Skeleton
}): Promise<HydrationState> => {
const { ctx, params, skeleton } = inputs
return ctx.hydrator.hydrateFeedItems(skeleton.items, params.viewer)
}

const noBlocksOrMutes = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}): Skeleton => {
const { ctx, skeleton, hydration } = inputs
skeleton.items = skeleton.items.filter((item) => {
const bam = ctx.views.feedItemBlocksAndMutes(item, hydration)
return (
!bam.authorBlocked &&
!bam.authorMuted &&
!bam.originatorBlocked &&
!bam.originatorMuted
)
})
return skeleton
}

const presentation = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = inputs
const feed = mapDefined(skeleton.items, (item) =>
ctx.views.feedViewPost(item, hydration),
)
return { feed, cursor: skeleton.cursor }
}
type Context = {
hydrator: Hydrator
views: Views
dataplane: DataPlaneClient
}

type Params = QueryParams & { viewer: string | null }

type Skeleton = {
items: FeedItem[]
cursor?: string
}

@@ -1,27 +1,22 @@
import { InvalidRequestError } from '@atproto/xrpc-server'
import { AtUri } from '@atproto/syntax'
import { Server } from '../../../../lexicon'
import {
BlockedPost,
NotFoundPost,
ThreadViewPost,
isNotFoundPost,
} from '../../../../lexicon/types/app/bsky/feed/defs'
import {
QueryParams,
OutputSchema,
} from '../../../../lexicon/types/app/bsky/feed/getPostThread'
import AppContext from '../../../../context'
import { setRepoRev } from '../../../util'
import {
HydrationFnInput,
PresentationFnInput,
SkeletonFnInput,
createPipeline,
noRules,
} from '../../../../pipeline'
import { Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { DataPlaneClient, isDataplaneError, Code } from '../../../../data-plane'
export default function (server: Server, ctx: AppContext) {
const getPostThread = createPipeline(
@@ -34,64 +29,69 @@
auth: ctx.authVerifier.optionalStandardOrRole,
handler: async ({ params, auth, res }) => {
const { viewer } = ctx.authVerifier.parseCreds(auth)
let result: OutputSchema
try {
result = await getPostThread({ ...params, viewer }, ctx)
} catch (err) {
const repoRev = await ctx.hydrator.actor.getRepoRevSafe(viewer)
setRepoRev(res, repoRev)
throw err
}
const repoRev = await ctx.hydrator.actor.getRepoRevSafe(viewer)
setRepoRev(res, repoRev)
return {
encoding: 'application/json',
body: result,
}
},
})
}
const skeleton = async (inputs: SkeletonFnInput<Context, Params>) => {
const { ctx, params } = inputs
try {
const res = await ctx.dataplane.getThread({
postUri: params.uri,
above: params.parentHeight,
below: params.depth,
})
return {
anchor: params.uri,
uris: res.uris,
}
} catch (err) {
if (isDataplaneError(err, Code.NotFound)) {
return {
anchor: params.uri,
uris: [],
}
} else {
throw err
}
}
}
const hydration = async (
inputs: HydrationFnInput<Context, Params, Skeleton>,
) => {
const { ctx, params, skeleton } = inputs
return ctx.hydrator.hydrateThreadPosts(
skeleton.uris.map((uri) => ({ uri })),
params.viewer,
)
}
const presentation = (
inputs: PresentationFnInput<Context, Params, Skeleton>,
) => {
const { ctx, params, skeleton, hydration } = inputs
const thread = ctx.views.thread(skeleton, hydration, {
height: params.parentHeight,
depth: params.depth,
})
if (isNotFoundPost(thread)) {
// @TODO technically this could be returned as a NotFoundPost based on lexicon
throw new InvalidRequestError(`Post not found: ${params.uri}`, 'NotFound')
@@ -99,234 +99,15 @@ const presentation = (state: HydrationState, ctx: Context) => {
return { thread }
}
const composeThread = (
threadData: PostThread,
actors: ActorInfoMap,
state: HydrationState,
ctx: Context,
viewer: string | null,
) => {
const { feedService } = ctx
const { posts, threadgates, embeds, blocks, labels, lists } = state
const post = feedService.views.formatPostView(
threadData.post.postUri,
actors,
posts,
threadgates,
embeds,
labels,
lists,
viewer,
)
// replies that are invalid due to reply-gating:
// a. may appear as the anchor post, but without any parent or replies.
// b. may not appear anywhere else in the thread.
const isAnchorPost = state.threadData.post.uri === threadData.post.postUri
const info = posts[threadData.post.postUri]
// @TODO re-enable invalidReplyRoot check
// const badReply = !!info?.invalidReplyRoot || !!info?.violatesThreadGate
const badReply = !!info?.violatesThreadGate
const violatesBlock = (post && blocks[post.uri]?.reply) ?? false
const omitBadReply = !isAnchorPost && (badReply || violatesBlock)
if (!post || omitBadReply) {
return {
$type: 'app.bsky.feed.defs#notFoundPost',
uri: threadData.post.postUri,
notFound: true,
}
}
if (post.author.viewer?.blocking || post.author.viewer?.blockedBy) {
return {
$type: 'app.bsky.feed.defs#blockedPost',
uri: threadData.post.postUri,
blocked: true,
author: {
did: post.author.did,
viewer: post.author.viewer
? {
blockedBy: post.author.viewer?.blockedBy,
blocking: post.author.viewer?.blocking,
}
: undefined,
},
}
}
let parent
if (threadData.parent && !badReply && !violatesBlock) {
if (threadData.parent instanceof ParentNotFoundError) {
parent = {
$type: 'app.bsky.feed.defs#notFoundPost',
uri: threadData.parent.uri,
notFound: true,
}
} else {
parent = composeThread(threadData.parent, actors, state, ctx, viewer)
}
}
let replies: (ThreadViewPost | NotFoundPost | BlockedPost)[] | undefined
if (threadData.replies && !badReply) {
replies = threadData.replies.flatMap((reply) => {
const thread = composeThread(reply, actors, state, ctx, viewer)
// e.g. don't bother including #postNotFound reply placeholders for takedowns. either way matches api contract.
const skip = []
return isNotFoundPost(thread) ? skip : thread
})
}
return {
$type: 'app.bsky.feed.defs#threadViewPost',
post,
parent,
replies,
}
}
const getRelevantIds = (
thread: PostThread,
): { dids: Set<string>; uris: Set<string> } => {
const dids = new Set<string>()
const uris = new Set<string>()
if (thread.parent && !(thread.parent instanceof ParentNotFoundError)) {
const fromParent = getRelevantIds(thread.parent)
fromParent.dids.forEach((did) => dids.add(did))
fromParent.uris.forEach((uri) => uris.add(uri))
}
if (thread.replies) {
for (const reply of thread.replies) {
const fromChild = getRelevantIds(reply)
fromChild.dids.forEach((did) => dids.add(did))
fromChild.uris.forEach((uri) => uris.add(uri))
}
}
dids.add(thread.post.postAuthorDid)
uris.add(thread.post.postUri)
if (thread.post.replyRoot) {
// ensure root is included for checking interactions
uris.add(thread.post.replyRoot)
dids.add(new AtUri(thread.post.replyRoot).hostname)
}
return { dids, uris }
}
const getThreadData = async (
params: Params,
ctx: Context,
): Promise<PostThread | null> => {
const { db, feedService } = ctx
const { uri, depth, parentHeight } = params
const [parents, children] = await Promise.all([
getAncestorsAndSelfQb(db.db, { uri, parentHeight })
.selectFrom('ancestor')
.innerJoin(
feedService.selectPostQb().as('post'),
'post.uri',
'ancestor.uri',
)
.selectAll('post')
.execute(),
getDescendentsQb(db.db, { uri, depth })
.selectFrom('descendent')
.innerJoin(
feedService.selectPostQb().as('post'),
'post.uri',
'descendent.uri',
)
.selectAll('post')
.orderBy('sortAt', 'desc')
.execute(),
])
// prevent self-referential loops
const includedPosts = new Set<string>([uri])
const parentsByUri = parents.reduce((acc, post) => {
return Object.assign(acc, { [post.uri]: post })
}, {} as Record<string, FeedRow>)
const childrenByParentUri = children.reduce((acc, child) => {
if (!child.replyParent) return acc
if (includedPosts.has(child.uri)) return acc
includedPosts.add(child.uri)
acc[child.replyParent] ??= []
acc[child.replyParent].push(child)
return acc
}, {} as Record<string, FeedRow[]>)
const post = parentsByUri[uri]
if (!post) return null
return {
post,
parent: post.replyParent
? getParentData(
parentsByUri,
includedPosts,
post.replyParent,
parentHeight,
)
: undefined,
replies: getChildrenData(childrenByParentUri, uri, depth),
}
}
const getParentData = (
postsByUri: Record<string, FeedRow>,
includedPosts: Set<string>,
uri: string,
depth: number,
): PostThread | ParentNotFoundError | undefined => {
if (depth < 1) return undefined
if (includedPosts.has(uri)) return undefined
includedPosts.add(uri)
const post = postsByUri[uri]
if (!post) return new ParentNotFoundError(uri)
return {
post,
parent: post.replyParent
? getParentData(postsByUri, includedPosts, post.replyParent, depth - 1)
: undefined,
replies: [],
}
}
const getChildrenData = (
childrenByParentUri: Record<string, FeedRow[]>,
uri: string,
depth: number,
): PostThread[] | undefined => {
if (depth === 0) return undefined
const children = childrenByParentUri[uri] ?? []
return children.map((row) => ({
post: row,
replies: getChildrenData(childrenByParentUri, row.postUri, depth - 1),
}))
}
class ParentNotFoundError extends Error {
constructor(public uri: string) {
super(`Parent not found: ${uri}`)
}
}
type PostThread = {
post: FeedRow
parent?: PostThread | ParentNotFoundError
replies?: PostThread[]
}
type Context = {
dataplane: DataPlaneClient
hydrator: Hydrator
views: Views
}

type Params = QueryParams & { viewer: string | null }

type Skeleton = {
anchor: string
uris: string[]
}
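// Illustrative sketch of the dataplane error-handling pattern used in the
// skeleton above and in getFeedGenerator: a NotFound from the dataplane is
// treated as "no data" rather than a hard failure. The helper below calls
// @connectrpc/connect directly and its names are hypothetical; the handlers
// themselves use the package's own isDataplaneError/Code re-exports.
import { Code, ConnectError } from '@connectrpc/connect'

const isNotFound = (err: unknown): boolean =>
  err instanceof ConnectError && err.code === Code.NotFound

async function orEmpty<T>(fetch: () => Promise<T>, empty: T): Promise<T> {
  try {
    return await fetch()
  } catch (err) {
    if (isNotFound(err)) return empty
    throw err
  }
}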

@@ -1,30 +1,19 @@
import { dedupeStrs, mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/feed/getPosts'
import AppContext from '../../../../context'
import { createPipeline } from '../../../../pipeline'
import { HydrationState, Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { creatorFromUri } from '../../../../views/util'
export default function (server: Server, ctx: AppContext) {
const getPosts = createPipeline(skeleton, hydration, noBlocks, presentation)
server.app.bsky.feed.getPosts({
auth: ctx.authVerifier.standardOptional,
handler: async ({ params, auth }) => {
const viewer = auth.credentials.iss
const results = await getPosts({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
@@ -34,68 +23,55 @@ export default function (server: Server, ctx: AppContext) {
})
}
const skeleton = async (inputs: { params: Params }) => {
return { posts: dedupeStrs(inputs.params.uris) }
}

const hydration = async (inputs: {
ctx: Context
params: Params
skeleton: Skeleton
}) => {
const { ctx, params, skeleton } = inputs
return ctx.hydrator.hydratePosts(
skeleton.posts.map((uri) => ({ uri })),
params.viewer,
)
}

const noBlocks = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = inputs
skeleton.posts = skeleton.posts.filter((uri) => {
const creator = creatorFromUri(uri)
return !ctx.views.viewerBlockExists(creator, hydration)
})
return skeleton
}

const presentation = (inputs: {
ctx: Context
params: Params
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = inputs
const posts = mapDefined(skeleton.posts, (uri) =>
ctx.views.post(uri, hydration),
)
return { posts }
}

type Context = {
hydrator: Hydrator
views: Views
}

type Params = QueryParams & { viewer: string | null }

type Skeleton = {
posts: string[]
}

@@ -1,14 +1,13 @@
import { mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/feed/getRepostedBy'
import { paginate, TimeCidKeyset } from '../../../../db/pagination'
import AppContext from '../../../../context'
import { notSoftDeletedClause } from '../../../../db/util'
import { Database } from '../../../../db'
import { ActorInfoMap, ActorService } from '../../../../services/actor'
import { BlockAndMuteState, GraphService } from '../../../../services/graph'
import { Actor } from '../../../../db/tables/actor'
import { createPipeline } from '../../../../pipeline'
import { HydrationState, Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { parseString } from '../../../../hydration/util'
import { creatorFromUri } from '../../../../views/util'
import { clearlyBadCursor } from '../../../util'
export default function (server: Server, ctx: AppContext) {
const getRepostedBy = createPipeline(
@@ -20,15 +19,8 @@ export default function (server: Server, ctx: AppContext) {
server.app.bsky.feed.getRepostedBy({
auth: ctx.authVerifier.standardOptional,
handler: async ({ params, auth }) => {
const viewer = auth.credentials.iss
const result = await getRepostedBy({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
@@ -38,85 +30,78 @@ export default function (server: Server, ctx: AppContext) {
})
}
const skeleton = async (
params: Params,
ctx: Context,
): Promise<SkeletonState> => {
const { db } = ctx
const { limit, cursor, uri, cid } = params
const { ref } = db.db.dynamic
if (TimeCidKeyset.clearlyBad(cursor)) {
return { params, repostedBy: [] }
const skeleton = async (inputs: {
ctx: Context
params: Params
}): Promise<Skeleton> => {
const { ctx, params } = inputs
if (clearlyBadCursor(params.cursor)) {
return { reposts: [] }
}
let builder = db.db
.selectFrom('repost')
.where('repost.subject', '=', uri)
.innerJoin('actor as creator', 'creator.did', 'repost.creator')
.where(notSoftDeletedClause(ref('creator')))
.selectAll('creator')
.select(['repost.cid as cid', 'repost.sortAt as sortAt'])
if (cid) {
builder = builder.where('repost.subjectCid', '=', cid)
}
const keyset = new TimeCidKeyset(ref('repost.sortAt'), ref('repost.cid'))
builder = paginate(builder, {
limit,
cursor,
keyset,
const res = await ctx.hydrator.dataplane.getRepostsBySubject({
subject: { uri: params.uri, cid: params.cid },
cursor: params.cursor,
limit: params.limit,
})
const repostedBy = await builder.execute()
return { params, repostedBy, cursor: keyset.packFromResult(repostedBy) }
return {
reposts: res.uris,
cursor: parseString(res.cursor),
}
}
const hydration = async (state: SkeletonState, ctx: Context) => {
const { graphService, actorService } = ctx
const { params, repostedBy } = state
const { viewer } = params
const [actors, bam] = await Promise.all([
actorService.views.profiles(repostedBy, viewer),
graphService.getBlockAndMuteState(
viewer ? repostedBy.map((item) => [viewer, item.did]) : [],
),
])
return { ...state, bam, actors }
const hydration = async (inputs: {
ctx: Context
params: Params
skeleton: Skeleton
}) => {
const { ctx, params, skeleton } = inputs
return await ctx.hydrator.hydrateReposts(skeleton.reposts, params.viewer)
}
const noBlocks = (state: HydrationState) => {
const { viewer } = state.params
if (!viewer) return state
state.repostedBy = state.repostedBy.filter(
(item) => !state.bam.block([viewer, item.did]),
)
return state
const noBlocks = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = inputs
skeleton.reposts = skeleton.reposts.filter((uri) => {
const creator = creatorFromUri(uri)
return !ctx.views.viewerBlockExists(creator, hydration)
})
return skeleton
}
const presentation = (state: HydrationState) => {
const { params, repostedBy, actors, cursor } = state
const { uri, cid } = params
const repostedByView = mapDefined(repostedBy, (item) => actors[item.did])
return { repostedBy: repostedByView, cursor, uri, cid }
const presentation = (inputs: {
ctx: Context
params: Params
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, params, skeleton, hydration } = inputs
const repostViews = mapDefined(skeleton.reposts, (uri) => {
const repost = hydration.reposts?.get(uri)
if (!repost?.record) {
return
}
const creatorDid = creatorFromUri(uri)
return ctx.views.profile(creatorDid, hydration)
})
return {
repostedBy: repostViews,
cursor: skeleton.cursor,
uri: params.uri,
cid: params.cid,
}
}
type Context = {
db: Database
actorService: ActorService
graphService: GraphService
hydrator: Hydrator
views: Views
}
type Params = QueryParams & { viewer: string | null }
type SkeletonState = {
params: Params
repostedBy: Actor[]
type Skeleton = {
reposts: string[]
cursor?: string
}
type HydrationState = SkeletonState & {
bam: BlockAndMuteState
actors: ActorInfoMap
}

@@ -1,37 +1,31 @@
import { mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { parseString } from '../../../../hydration/util'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.feed.getSuggestedFeeds({
auth: ctx.authVerifier.standardOptional,
handler: async ({ auth, params }) => {
const viewer = auth.credentials.iss
// @NOTE no need to coordinate the cursor for appview swap, as v1 doesn't use the cursor
const suggestedRes = await ctx.dataplane.getSuggestedFeeds({
actorDid: viewer ?? undefined,
limit: params.limit,
cursor: params.cursor,
})
const uris = suggestedRes.uris
const hydration = await ctx.hydrator.hydrateFeedGens(uris, viewer)
const feedViews = mapDefined(uris, (uri) =>
ctx.views.feedGenerator(uri, hydration),
)
return {
encoding: 'application/json',
body: {
feeds: feedViews,
cursor: parseString(suggestedRes.cursor),
},
}
},

@@ -1,18 +1,14 @@
import { sql } from 'kysely'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import { FeedAlgorithm, FeedKeyset, getFeedDateThreshold } from '../util/feed'
import { paginate } from '../../../../db/pagination'
import AppContext from '../../../../context'
import { Database } from '../../../../db'
import { QueryParams } from '../../../../lexicon/types/app/bsky/feed/getTimeline'
import { setRepoRev } from '../../../util'
import {
FeedHydrationState,
FeedRow,
FeedService,
} from '../../../../services/feed'
import { clearlyBadCursor, setRepoRev } from '../../../util'
import { createPipeline } from '../../../../pipeline'
import { HydrationState, Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { DataPlaneClient } from '../../../../data-plane'
import { parseString } from '../../../../hydration/util'
import { mapDefined } from '@atproto/common'
import { FeedItem } from '../../../../hydration/feed'
export default function (server: Server, ctx: AppContext) {
const getTimeline = createPipeline(
@@ -25,15 +21,10 @@ export default function (server: Server, ctx: AppContext) {
auth: ctx.authVerifier.standard,
handler: async ({ params, auth, res }) => {
const viewer = auth.credentials.iss
const result = await getTimeline({ ...params, viewer }, ctx)
const repoRev = await ctx.hydrator.actor.getRepoRevSafe(viewer)
setRepoRev(res, repoRev)
return {
@@ -44,181 +35,78 @@ export default function (server: Server, ctx: AppContext) {
})
}
export const skeleton = async (
params: Params,
ctx: Context,
): Promise<SkeletonState> => {
const { cursor, limit, algorithm, viewer } = params
const { db } = ctx
const { ref } = db.db.dynamic
if (algorithm && algorithm !== FeedAlgorithm.ReverseChronological) {
throw new InvalidRequestError(`Unsupported algorithm: ${algorithm}`)
export const skeleton = async (inputs: {
ctx: Context
params: Params
}): Promise<Skeleton> => {
const { ctx, params } = inputs
if (clearlyBadCursor(params.cursor)) {
return { items: [] }
}
if (limit === 1 && !cursor) {
// special case for limit=1, which is often used to check if there are new items at the top of the timeline.
return skeletonLimit1(params, ctx)
}
if (FeedKeyset.clearlyBad(cursor)) {
return { params, feedItems: [] }
}
const keyset = new FeedKeyset(ref('feed_item.sortAt'), ref('feed_item.cid'))
const sortFrom = keyset.unpack(cursor)?.primary
let followQb = db.db
.selectFrom('feed_item')
.innerJoin('follow', 'follow.subjectDid', 'feed_item.originatorDid')
.where('follow.creator', '=', viewer)
.innerJoin('post', 'post.uri', 'feed_item.postUri')
.where('feed_item.sortAt', '>', getFeedDateThreshold(sortFrom, 2))
.selectAll('feed_item')
.select([
'post.replyRoot',
'post.replyParent',
'post.creator as postAuthorDid',
])
followQb = paginate(followQb, {
limit,
cursor,
keyset,
tryIndex: true,
const res = await ctx.dataplane.getTimeline({
actorDid: params.viewer,
limit: params.limit,
cursor: params.cursor,
})
let selfQb = db.db
.selectFrom('feed_item')
.innerJoin('post', 'post.uri', 'feed_item.postUri')
.where('feed_item.originatorDid', '=', viewer)
.where('feed_item.sortAt', '>', getFeedDateThreshold(sortFrom, 2))
.selectAll('feed_item')
.select([
'post.replyRoot',
'post.replyParent',
'post.creator as postAuthorDid',
])
selfQb = paginate(selfQb, {
limit: Math.min(limit, 10),
cursor,
keyset,
tryIndex: true,
})
const [followRes, selfRes] = await Promise.all([
followQb.execute(),
selfQb.execute(),
])
const feedItems: FeedRow[] = [...followRes, ...selfRes]
.sort((a, b) => {
if (a.sortAt > b.sortAt) return -1
if (a.sortAt < b.sortAt) return 1
return a.cid > b.cid ? -1 : 1
})
.slice(0, limit)
return {
params,
feedItems,
cursor: keyset.packFromResult(feedItems),
items: res.items.map((item) => ({
post: { uri: item.uri, cid: item.cid || undefined },
repost: item.repost
? { uri: item.repost, cid: item.repostCid || undefined }
: undefined,
})),
cursor: parseString(res.cursor),
}
}
// The limit=1 case is used commonly to check if there are new items at the top of the timeline.
// Since it's so common, it's optimized here. The most common strategy that postgres takes to
// build a timeline is to grab all recent content from each of the user's follow, then paginate it.
// The downside here is that it requires grabbing all recent content from all follows, even if you
// only want a single result. The approach here instead takes the single most recent post from
// each of the user's follows, then sorts only those and takes the top item.
const skeletonLimit1 = async (params: Params, ctx: Context) => {
const { viewer } = params
const { db } = ctx
const { ref } = db.db.dynamic
const creatorsQb = db.db
.selectFrom('follow')
.where('creator', '=', viewer)
.select('subjectDid as did')
.unionAll(sql`select ${viewer} as did`)
const feedItemsQb = db.db
.selectFrom(creatorsQb.as('creator'))
.innerJoinLateral(
(eb) => {
const keyset = new FeedKeyset(
ref('feed_item.sortAt'),
ref('feed_item.cid'),
)
const creatorFeedItemQb = eb
.selectFrom('feed_item')
.innerJoin('post', 'post.uri', 'feed_item.postUri')
.whereRef('feed_item.originatorDid', '=', 'creator.did')
.where('feed_item.sortAt', '>', getFeedDateThreshold(undefined, 2))
.selectAll('feed_item')
.select([
'post.replyRoot',
'post.replyParent',
'post.creator as postAuthorDid',
])
return paginate(creatorFeedItemQb, { limit: 1, keyset }).as('result')
},
(join) => join.onTrue(),
const hydration = async (inputs: {
ctx: Context
params: Params
skeleton: Skeleton
}): Promise<HydrationState> => {
const { ctx, params, skeleton } = inputs
return ctx.hydrator.hydrateFeedItems(skeleton.items, params.viewer)
}
const noBlocksOrMutes = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}): Skeleton => {
const { ctx, skeleton, hydration } = inputs
skeleton.items = skeleton.items.filter((item) => {
const bam = ctx.views.feedItemBlocksAndMutes(item, hydration)
return (
!bam.authorBlocked &&
!bam.authorMuted &&
!bam.originatorBlocked &&
!bam.originatorMuted
)
.selectAll('result')
const keyset = new FeedKeyset(ref('result.sortAt'), ref('result.cid'))
const feedItems = await paginate(feedItemsQb, { limit: 1, keyset }).execute()
return {
params,
feedItems,
cursor: keyset.packFromResult(feedItems),
}
}
const hydration = async (
state: SkeletonState,
ctx: Context,
): Promise<HydrationState> => {
const { feedService } = ctx
const { params, feedItems } = state
const refs = feedService.feedItemRefs(feedItems)
const hydrated = await feedService.feedHydration({
...refs,
viewer: params.viewer,
})
return { ...state, ...hydrated }
return skeleton
}
const noBlocksOrMutes = (state: HydrationState): HydrationState => {
const { viewer } = state.params
state.feedItems = state.feedItems.filter(
(item) =>
!state.bam.block([viewer, item.postAuthorDid]) &&
!state.bam.block([viewer, item.originatorDid]) &&
!state.bam.mute([viewer, item.postAuthorDid]) &&
!state.bam.mute([viewer, item.originatorDid]),
const presentation = (inputs: {
ctx: Context
skeleton: Skeleton
hydration: HydrationState
}) => {
const { ctx, skeleton, hydration } = inputs
const feed = mapDefined(skeleton.items, (item) =>
ctx.views.feedViewPost(item, hydration),
)
return state
}
const presentation = (state: HydrationState, ctx: Context) => {
const { feedService } = ctx
const { feedItems, cursor, params } = state
const feed = feedService.views.formatFeed(feedItems, state, params.viewer)
return { feed, cursor }
return { feed, cursor: skeleton.cursor }
}
type Context = {
db: Database
feedService: FeedService
hydrator: Hydrator
views: Views
dataplane: DataPlaneClient
}
type Params = QueryParams & { viewer: string }
type SkeletonState = {
params: Params
feedItems: FeedRow[]
type Skeleton = {
items: FeedItem[]
cursor?: string
}
type HydrationState = SkeletonState & FeedHydrationState

@@ -1,17 +1,20 @@
import AppContext from '../../../../context'
import { Server } from '../../../../lexicon'
import { InvalidRequestError } from '@atproto/xrpc-server'
import AtpAgent from '@atproto/api'
import { mapDefined } from '@atproto/common'
import { QueryParams } from '../../../../lexicon/types/app/bsky/feed/searchPosts'
import { Database } from '../../../../db'
import {
FeedHydrationState,
FeedRow,
FeedService,
} from '../../../../services/feed'
import { ActorService } from '../../../../services/actor'
import {
HydrationFnInput,
PresentationFnInput,
RulesFnInput,
SkeletonFnInput,
createPipeline,
} from '../../../../pipeline'
import { Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { DataPlaneClient } from '../../../../data-plane'
import { parseString } from '../../../../hydration/util'
import { creatorFromUri } from '../../../../views/util'
export default function (server: Server, ctx: AppContext) {
const searchPosts = createPipeline(
@@ -24,19 +27,7 @@ export default function (server: Server, ctx: AppContext) {
auth: ctx.authVerifier.standardOptional,
handler: async ({ auth, params }) => {
const viewer = auth.credentials.iss
const results = await searchPosts({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: results,
@@ -45,87 +36,78 @@ export default function (server: Server, ctx: AppContext) {
})
}
const skeleton = async (
params: Params,
ctx: Context,
): Promise<SkeletonState> => {
// @NOTE cursors wont change on appview swap
const res = await ctx.searchAgent.api.app.bsky.unspecced.searchPostsSkeleton({
q: params.q,
cursor: params.cursor,
const skeleton = async (inputs: SkeletonFnInput<Context, Params>) => {
const { ctx, params } = inputs
if (ctx.searchAgent) {
// @NOTE cursors wont change on appview swap
const { data: res } =
await ctx.searchAgent.api.app.bsky.unspecced.searchPostsSkeleton({
q: params.q,
cursor: params.cursor,
limit: params.limit,
})
return {
posts: res.posts.map(({ uri }) => uri),
cursor: parseString(res.cursor),
}
}
const res = await ctx.dataplane.searchPosts({
term: params.q,
limit: params.limit,
cursor: params.cursor,
})
const postUris = res.data.posts.map((a) => a.uri)
const feedItems = await ctx.feedService.postUrisToFeedItems(postUris)
return {
params,
feedItems,
cursor: res.data.cursor,
hitsTotal: res.data.hitsTotal,
posts: res.uris,
cursor: parseString(res.cursor),
}
}
const hydration = async (
state: SkeletonState,
ctx: Context,
): Promise<HydrationState> => {
const { feedService } = ctx
const { params, feedItems } = state
const refs = feedService.feedItemRefs(feedItems)
const hydrated = await feedService.feedHydration({
...refs,
viewer: params.viewer,
})
return { ...state, ...hydrated }
}
const noBlocks = (state: HydrationState): HydrationState => {
const { viewer } = state.params
state.feedItems = state.feedItems.filter((item) => {
if (!viewer) return true
return !state.bam.block([viewer, item.postAuthorDid])
})
return state
}
const presentation = (state: HydrationState, ctx: Context) => {
const { feedService, actorService } = ctx
const { feedItems, profiles, params } = state
const actors = actorService.views.profileBasicPresentation(
Object.keys(profiles),
state,
inputs: HydrationFnInput<Context, Params, Skeleton>,
) => {
const { ctx, params, skeleton } = inputs
return ctx.hydrator.hydratePosts(
skeleton.posts.map((uri) => ({ uri })),
params.viewer,
)
}
const postViews = mapDefined(feedItems, (item) =>
feedService.views.formatPostView(
item.postUri,
actors,
state.posts,
state.threadgates,
state.embeds,
state.labels,
state.lists,
params.viewer,
),
const noBlocks = (inputs: RulesFnInput<Context, Params, Skeleton>) => {
const { ctx, skeleton, hydration } = inputs
skeleton.posts = skeleton.posts.filter((uri) => {
const creator = creatorFromUri(uri)
return !ctx.views.viewerBlockExists(creator, hydration)
})
return skeleton
}
const presentation = (
inputs: PresentationFnInput<Context, Params, Skeleton>,
) => {
const { ctx, skeleton, hydration } = inputs
const posts = mapDefined(skeleton.posts, (uri) =>
ctx.views.post(uri, hydration),
)
return { posts: postViews, cursor: state.cursor, hitsTotal: state.hitsTotal }
return {
posts,
cursor: skeleton.cursor,
hitsTotal: skeleton.hitsTotal,
}
}
type Context = {
db: Database
feedService: FeedService
actorService: ActorService
searchAgent: AtpAgent
dataplane: DataPlaneClient
hydrator: Hydrator
views: Views
searchAgent?: AtpAgent
}
type Params = QueryParams & { viewer: string | null }
type SkeletonState = {
params: Params
feedItems: FeedRow[]
type Skeleton = {
posts: string[]
hitsTotal?: number
cursor?: string
}
type HydrationState = SkeletonState & FeedHydrationState

@@ -1,54 +1,84 @@
import { mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import { paginate, TimeCidKeyset } from '../../../../db/pagination'
import { QueryParams } from '../../../../lexicon/types/app/bsky/graph/getBlocks'
import AppContext from '../../../../context'
import { notSoftDeletedClause } from '../../../../db/util'
import {
createPipeline,
HydrationFnInput,
noRules,
PresentationFnInput,
SkeletonFnInput,
} from '../../../../pipeline'
import { Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { clearlyBadCursor } from '../../../util'
export default function (server: Server, ctx: AppContext) {
const getBlocks = createPipeline(skeleton, hydration, noRules, presentation)
server.app.bsky.graph.getBlocks({
auth: ctx.authVerifier.standard,
handler: async ({ params, auth }) => {
const viewer = auth.credentials.iss
const result = await getBlocks({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: result,
}
},
})
}
const skeleton = async (input: SkeletonFnInput<Context, Params>) => {
const { params, ctx } = input
if (clearlyBadCursor(params.cursor)) {
return { blockedDids: [] }
}
const { blockUris, cursor } = await ctx.hydrator.dataplane.getBlocks({
actorDid: params.viewer,
cursor: params.cursor,
limit: params.limit,
})
const blocks = await ctx.hydrator.graph.getBlocks(blockUris)
const blockedDids = mapDefined(
blockUris,
(uri) => blocks.get(uri)?.record.subject,
)
return {
blockedDids,
cursor: cursor || undefined,
}
}
const hydration = async (
input: HydrationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, params, skeleton } = input
const { viewer } = params
const { blockedDids } = skeleton
return ctx.hydrator.hydrateProfiles(blockedDids, viewer)
}
const presentation = (
input: PresentationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, hydration, skeleton } = input
const { blockedDids, cursor } = skeleton
const blocks = mapDefined(blockedDids, (did) => {
return ctx.views.profile(did, hydration)
})
return { blocks, cursor }
}
type Context = {
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
viewer: string
}
type SkeletonState = {
blockedDids: string[]
cursor?: string
}

@@ -3,32 +3,33 @@ import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/graph/getFollowers'
import AppContext from '../../../../context'
import { Database } from '../../../../db'
import { notSoftDeletedClause } from '../../../../db/util'
import { paginate, TimeCidKeyset } from '../../../../db/pagination'
import { Actor } from '../../../../db/tables/actor'
import { ActorInfoMap, ActorService } from '../../../../services/actor'
import { BlockAndMuteState, GraphService } from '../../../../services/graph'
import { createPipeline } from '../../../../pipeline'
import {
HydrationFnInput,
PresentationFnInput,
RulesFnInput,
SkeletonFnInput,
createPipeline,
} from '../../../../pipeline'
import { didFromUri } from '../../../../hydration/util'
import { Hydrator, mergeStates } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { clearlyBadCursor } from '../../../util'
export default function (server: Server, ctx: AppContext) {
const getFollowers = createPipeline(
skeleton,
hydration,
noBlocksInclInvalid,
noBlocks,
presentation,
)
server.app.bsky.graph.getFollowers({
auth: ctx.authVerifier.optionalStandardOrRole,
handler: async ({ params, auth }) => {
const { viewer, canViewTakedowns } = ctx.authVerifier.parseCreds(auth)
const result = await getFollowers(
{ ...params, viewer, canViewTakedowns },
ctx,
)
return {
@@ -39,95 +40,86 @@ export default function (server: Server, ctx: AppContext) {
})
}
const skeleton = async (
params: Params,
ctx: Context,
): Promise<SkeletonState> => {
const { db, actorService } = ctx
const { limit, cursor, actor, canViewTakedowns } = params
const { ref } = db.db.dynamic
const subject = await actorService.getActor(actor, canViewTakedowns)
if (!subject) {
throw new InvalidRequestError(`Actor not found: ${actor}`)
}
if (TimeCidKeyset.clearlyBad(cursor)) {
return { params, followers: [], subject }
}
let followersReq = db.db
.selectFrom('follow')
.where('follow.subjectDid', '=', subject.did)
.innerJoin('actor as creator', 'creator.did', 'follow.creator')
.if(!canViewTakedowns, (qb) =>
qb.where(notSoftDeletedClause(ref('creator'))),
)
.selectAll('creator')
.select(['follow.cid as cid', 'follow.sortAt as sortAt'])
const keyset = new TimeCidKeyset(ref('follow.sortAt'), ref('follow.cid'))
followersReq = paginate(followersReq, {
limit,
cursor,
keyset,
})
const followers = await followersReq.execute()
return {
params,
followers,
subject,
cursor: keyset.packFromResult(followers),
}
}
const hydration = async (state: SkeletonState, ctx: Context) => {
const { graphService, actorService } = ctx
const { params, followers, subject } = state
const { viewer } = params
const [actors, bam] = await Promise.all([
actorService.views.profiles([subject, ...followers], viewer),
graphService.getBlockAndMuteState(
followers.flatMap((item) => {
if (viewer) {
return [
[viewer, item.did],
[subject.did, item.did],
]
}
return [[subject.did, item.did]]
}),
),
])
return { ...state, bam, actors }
}
const noBlocksInclInvalid = (state: HydrationState) => {
const { subject } = state
const { viewer } = state.params
state.followers = state.followers.filter(
(item) =>
!state.bam.block([subject.did, item.did]) &&
(!viewer || !state.bam.block([viewer, item.did])),
)
return state
}
const presentation = (state: HydrationState) => {
const { params, followers, subject, actors, cursor } = state
const subjectView = actors[subject.did]
const followersView = mapDefined(followers, (item) => actors[item.did])
if (!subjectView) {
const skeleton = async (input: SkeletonFnInput<Context, Params>) => {
const { params, ctx } = input
const [subjectDid] = await ctx.hydrator.actor.getDidsDefined([params.actor])
if (!subjectDid) {
throw new InvalidRequestError(`Actor not found: ${params.actor}`)
}
return { followers: followersView, subject: subjectView, cursor }
if (clearlyBadCursor(params.cursor)) {
return { subjectDid, followUris: [] }
}
const { followers, cursor } = await ctx.hydrator.graph.getActorFollowers({
did: subjectDid,
cursor: params.cursor,
limit: params.limit,
})
return {
subjectDid,
followUris: followers.map((f) => f.uri),
cursor: cursor || undefined,
}
}
const hydration = async (
input: HydrationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, params, skeleton } = input
const { viewer } = params
const { followUris, subjectDid } = skeleton
const followState = await ctx.hydrator.hydrateFollows(followUris)
const dids = [subjectDid]
if (followState.follows) {
for (const [uri, follow] of followState.follows) {
if (follow) {
dids.push(didFromUri(uri))
}
}
}
const profileState = await ctx.hydrator.hydrateProfiles(dids, viewer)
return mergeStates(followState, profileState)
}
const noBlocks = (input: RulesFnInput<Context, Params, SkeletonState>) => {
const { skeleton, params, hydration, ctx } = input
const { viewer } = params
skeleton.followUris = skeleton.followUris.filter((followUri) => {
const followerDid = didFromUri(followUri)
return (
!hydration.followBlocks?.get(followUri) &&
(!viewer || !ctx.views.viewerBlockExists(followerDid, hydration))
)
})
return skeleton
}
const presentation = (
input: PresentationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, hydration, skeleton, params } = input
const { subjectDid, followUris, cursor } = skeleton
const isTakendown = (did: string) =>
ctx.views.actorIsTakendown(did, hydration)
const subject = ctx.views.profile(subjectDid, hydration)
if (!subject || (!params.canViewTakedowns && isTakendown(subjectDid))) {
throw new InvalidRequestError(`Actor not found: ${params.actor}`)
}
const followers = mapDefined(followUris, (followUri) => {
const followerDid = didFromUri(followUri)
if (!params.canViewTakedowns && isTakendown(followerDid)) {
return
}
    return ctx.views.profile(followerDid, hydration)
})
return { followers, subject, cursor }
}
type Context = {
db: Database
actorService: ActorService
graphService: GraphService
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
@ -136,13 +128,7 @@ type Params = QueryParams & {
}
type SkeletonState = {
params: Params
followers: Actor[]
subject: Actor
subjectDid: string
followUris: string[]
cursor?: string
}
type HydrationState = SkeletonState & {
bam: BlockAndMuteState
actors: ActorInfoMap
}
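
The getFollowers diff above shows the general v2 shape: each read route is assembled from four stage functions via createPipeline(skeleton, hydration, rules, presentation) and then invoked with (params, ctx). A minimal sketch of how such a factory could compose the stages, assuming the input shapes shown above (the type names and the factory body here are illustrative, not the actual pipeline.ts implementation):

```ts
// Sketch only: approximates the createPipeline contract implied by the stage
// functions above. Everything here is illustrative rather than the real pipeline.ts.
type SkeletonFn<Ctx, Params, Skel> = (input: { ctx: Ctx; params: Params }) => Promise<Skel>
type HydrationFn<Ctx, Params, Skel, Hyd> = (input: {
  ctx: Ctx
  params: Params
  skeleton: Skel
}) => Promise<Hyd>
type RulesFn<Ctx, Params, Skel, Hyd> = (input: {
  ctx: Ctx
  params: Params
  skeleton: Skel
  hydration: Hyd
}) => Skel
type PresentationFn<Ctx, Params, Skel, Hyd, View> = (input: {
  ctx: Ctx
  params: Params
  skeleton: Skel
  hydration: Hyd
}) => View

const createPipelineSketch =
  <Ctx, Params, Skel, Hyd, View>(
    skeletonFn: SkeletonFn<Ctx, Params, Skel>,
    hydrationFn: HydrationFn<Ctx, Params, Skel, Hyd>,
    rulesFn: RulesFn<Ctx, Params, Skel, Hyd>,
    presentationFn: PresentationFn<Ctx, Params, Skel, Hyd, View>,
  ) =>
  async (params: Params, ctx: Ctx): Promise<View> => {
    const skeleton = await skeletonFn({ ctx, params }) // 1. fetch only uris/dids/counts
    const hydration = await hydrationFn({ ctx, params, skeleton }) // 2. bulk-load records & profiles
    const filtered = rulesFn({ ctx, params, skeleton, hydration }) // 3. apply block/mute rules
    return presentationFn({ ctx, params, skeleton: filtered, hydration }) // 4. shape the lexicon view
  }
```

Each handler then only does `const result = await getX({ ...params, viewer }, ctx)`, as in the diffs above.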

@ -1,34 +1,30 @@
import { mapDefined } from '@atproto/common'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/graph/getFollows'
import { QueryParams } from '../../../../lexicon/types/app/bsky/graph/getFollowers'
import AppContext from '../../../../context'
import { Database } from '../../../../db'
import { notSoftDeletedClause } from '../../../../db/util'
import { paginate, TimeCidKeyset } from '../../../../db/pagination'
import { Actor } from '../../../../db/tables/actor'
import { ActorInfoMap, ActorService } from '../../../../services/actor'
import { BlockAndMuteState, GraphService } from '../../../../services/graph'
import { createPipeline } from '../../../../pipeline'
import {
HydrationFnInput,
PresentationFnInput,
RulesFnInput,
SkeletonFnInput,
createPipeline,
} from '../../../../pipeline'
import { Hydrator, mergeStates } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { clearlyBadCursor } from '../../../util'
export default function (server: Server, ctx: AppContext) {
const getFollows = createPipeline(
skeleton,
hydration,
noBlocksInclInvalid,
presentation,
)
const getFollows = createPipeline(skeleton, hydration, noBlocks, presentation)
server.app.bsky.graph.getFollows({
auth: ctx.authVerifier.optionalStandardOrRole,
handler: async ({ params, auth }) => {
const db = ctx.db.getReplica()
const actorService = ctx.services.actor(db)
const graphService = ctx.services.graph(db)
const { viewer, canViewTakedowns } = ctx.authVerifier.parseCreds(auth)
// @TODO ensure canViewTakedowns gets threaded through and applied properly
const result = await getFollows(
{ ...params, viewer, canViewTakedowns },
{ db, actorService, graphService },
ctx,
)
return {
@ -39,96 +35,89 @@ export default function (server: Server, ctx: AppContext) {
})
}
const skeleton = async (
params: Params,
ctx: Context,
): Promise<SkeletonState> => {
const { db, actorService } = ctx
const { limit, cursor, actor, canViewTakedowns } = params
const { ref } = db.db.dynamic
const creator = await actorService.getActor(actor, canViewTakedowns)
if (!creator) {
throw new InvalidRequestError(`Actor not found: ${actor}`)
}
if (TimeCidKeyset.clearlyBad(cursor)) {
return { params, follows: [], creator }
}
let followsReq = db.db
.selectFrom('follow')
.where('follow.creator', '=', creator.did)
.innerJoin('actor as subject', 'subject.did', 'follow.subjectDid')
.if(!canViewTakedowns, (qb) =>
qb.where(notSoftDeletedClause(ref('subject'))),
)
.selectAll('subject')
.select(['follow.cid as cid', 'follow.sortAt as sortAt'])
const keyset = new TimeCidKeyset(ref('follow.sortAt'), ref('follow.cid'))
followsReq = paginate(followsReq, {
limit,
cursor,
keyset,
})
const follows = await followsReq.execute()
return {
params,
follows,
creator,
cursor: keyset.packFromResult(follows),
}
}
const hydration = async (state: SkeletonState, ctx: Context) => {
const { graphService, actorService } = ctx
const { params, follows, creator } = state
const { viewer } = params
const [actors, bam] = await Promise.all([
actorService.views.profiles([creator, ...follows], viewer),
graphService.getBlockAndMuteState(
follows.flatMap((item) => {
if (viewer) {
return [
[viewer, item.did],
[creator.did, item.did],
]
}
return [[creator.did, item.did]]
}),
),
])
return { ...state, bam, actors }
}
const noBlocksInclInvalid = (state: HydrationState) => {
const { creator } = state
const { viewer } = state.params
state.follows = state.follows.filter(
(item) =>
!state.bam.block([creator.did, item.did]) &&
(!viewer || !state.bam.block([viewer, item.did])),
)
return state
}
const presentation = (state: HydrationState) => {
const { params, follows, creator, actors, cursor } = state
const creatorView = actors[creator.did]
const followsView = mapDefined(follows, (item) => actors[item.did])
if (!creatorView) {
const skeleton = async (input: SkeletonFnInput<Context, Params>) => {
const { params, ctx } = input
const [subjectDid] = await ctx.hydrator.actor.getDidsDefined([params.actor])
if (!subjectDid) {
throw new InvalidRequestError(`Actor not found: ${params.actor}`)
}
return { follows: followsView, subject: creatorView, cursor }
if (clearlyBadCursor(params.cursor)) {
return { subjectDid, followUris: [] }
}
const { follows, cursor } = await ctx.hydrator.graph.getActorFollows({
did: subjectDid,
cursor: params.cursor,
limit: params.limit,
})
return {
subjectDid,
followUris: follows.map((f) => f.uri),
cursor: cursor || undefined,
}
}
const hydration = async (
input: HydrationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, params, skeleton } = input
const { viewer } = params
const { followUris, subjectDid } = skeleton
const followState = await ctx.hydrator.hydrateFollows(followUris)
const dids = [subjectDid]
if (followState.follows) {
for (const follow of followState.follows.values()) {
if (follow) {
dids.push(follow.record.subject)
}
}
}
const profileState = await ctx.hydrator.hydrateProfiles(dids, viewer)
return mergeStates(followState, profileState)
}
const noBlocks = (input: RulesFnInput<Context, Params, SkeletonState>) => {
const { skeleton, params, hydration, ctx } = input
const { viewer } = params
skeleton.followUris = skeleton.followUris.filter((followUri) => {
const follow = hydration.follows?.get(followUri)
if (!follow) return false
return (
!hydration.followBlocks?.get(followUri) &&
(!viewer ||
!ctx.views.viewerBlockExists(follow.record.subject, hydration))
)
})
return skeleton
}
const presentation = (
input: PresentationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, hydration, skeleton, params } = input
const { subjectDid, followUris, cursor } = skeleton
const isTakendown = (did: string) =>
ctx.views.actorIsTakendown(did, hydration)
const subject = ctx.views.profile(subjectDid, hydration)
if (!subject || (!params.canViewTakedowns && isTakendown(subjectDid))) {
throw new InvalidRequestError(`Actor not found: ${params.actor}`)
}
const follows = mapDefined(followUris, (followUri) => {
const followDid = hydration.follows?.get(followUri)?.record.subject
if (!followDid) return
if (!params.canViewTakedowns && isTakendown(followDid)) {
return
}
return ctx.views.profile(followDid, hydration)
})
return { follows, subject, cursor }
}
type Context = {
db: Database
actorService: ActorService
graphService: GraphService
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
@ -137,13 +126,7 @@ type Params = QueryParams & {
}
type SkeletonState = {
params: Params
follows: Actor[]
creator: Actor
subjectDid: string
followUris: string[]
cursor?: string
}
type HydrationState = SkeletonState & {
bam: BlockAndMuteState
actors: ActorInfoMap
}
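
getFollows mirrors getFollowers, with one subtle difference: for a follow record, the follower is the record's creator and can be recovered from the at-uri alone (didFromUri), while the account being followed only exists in the record body, so the getFollows rules and presentation stages read hydration.follows.get(uri)?.record.subject instead. A purely illustrative snippet of that distinction (the example did values are made up):

```ts
// Illustrative only. For a follow record like:
//   at://did:plc:alice/app.bsky.graph.follow/3kabc
// alice (the record creator, encoded in the uri) follows the did named in record.subject.
const followUri = 'at://did:plc:alice/app.bsky.graph.follow/3kabc'

// getFollowers wants the follower, i.e. the record creator, so the uri alone is enough:
const followerDid = followUri.split('/')[2] // 'did:plc:alice', what didFromUri(uri) extracts

// getFollows wants the account being followed, which only lives in the record body,
// so it reads hydration.follows.get(uri)?.record.subject instead (stand-in value here):
const followedDid = 'did:plc:bob'
```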

@ -3,28 +3,25 @@ import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/graph/getList'
import AppContext from '../../../../context'
import { Database } from '../../../../db'
import { paginate, TimeCidKeyset } from '../../../../db/pagination'
import { Actor } from '../../../../db/tables/actor'
import { GraphService, ListInfo } from '../../../../services/graph'
import { ActorService, ProfileHydrationState } from '../../../../services/actor'
import { createPipeline, noRules } from '../../../../pipeline'
import {
createPipeline,
HydrationFnInput,
noRules,
PresentationFnInput,
SkeletonFnInput,
} from '../../../../pipeline'
import { Hydrator, mergeStates } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { clearlyBadCursor } from '../../../util'
import { ListItemInfo } from '../../../../proto/bsky_pb'
export default function (server: Server, ctx: AppContext) {
const getList = createPipeline(skeleton, hydration, noRules, presentation)
server.app.bsky.graph.getList({
auth: ctx.authVerifier.standardOptional,
handler: async ({ params, auth }) => {
const db = ctx.db.getReplica()
const graphService = ctx.services.graph(db)
const actorService = ctx.services.actor(db)
const viewer = auth.credentials.iss
const result = await getList(
{ ...params, viewer },
{ db, graphService, actorService },
)
const result = await getList({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: result,
@ -34,89 +31,60 @@ export default function (server: Server, ctx: AppContext) {
}
const skeleton = async (
params: Params,
ctx: Context,
input: SkeletonFnInput<Context, Params>,
): Promise<SkeletonState> => {
const { db, graphService } = ctx
const { list, limit, cursor, viewer } = params
const { ref } = db.db.dynamic
const listRes = await graphService
.getListsQb(viewer)
.where('list.uri', '=', list)
.executeTakeFirst()
if (!listRes) {
throw new InvalidRequestError(`List not found: ${list}`)
const { ctx, params } = input
if (clearlyBadCursor(params.cursor)) {
return { listUri: params.list, listitems: [] }
}
if (TimeCidKeyset.clearlyBad(cursor)) {
return { params, list: listRes, listItems: [] }
}
let itemsReq = graphService
.getListItemsQb()
.where('list_item.listUri', '=', list)
.where('list_item.creator', '=', listRes.creator)
const keyset = new TimeCidKeyset(
ref('list_item.sortAt'),
ref('list_item.cid'),
)
itemsReq = paginate(itemsReq, {
limit,
cursor,
keyset,
const { listitems, cursor } = await ctx.hydrator.dataplane.getListMembers({
listUri: params.list,
limit: params.limit,
cursor: params.cursor,
})
const listItems = await itemsReq.execute()
return {
params,
list: listRes,
listItems,
cursor: keyset.packFromResult(listItems),
listUri: params.list,
listitems,
cursor: cursor || undefined,
}
}
const hydration = async (state: SkeletonState, ctx: Context) => {
const { actorService } = ctx
const { params, list, listItems } = state
const profileState = await actorService.views.profileHydration(
[list, ...listItems].map((x) => x.did),
{ viewer: params.viewer },
)
return { ...state, ...profileState }
const hydration = async (
input: HydrationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, params, skeleton } = input
const { viewer } = params
const { listUri, listitems } = skeleton
const [listState, profileState] = await Promise.all([
ctx.hydrator.hydrateLists([listUri], viewer),
ctx.hydrator.hydrateProfiles(
listitems.map(({ did }) => did),
viewer,
),
])
return mergeStates(listState, profileState)
}
const presentation = (state: HydrationState, ctx: Context) => {
const { actorService, graphService } = ctx
const { params, list, listItems, cursor, ...profileState } = state
const actors = actorService.views.profilePresentation(
Object.keys(profileState.profiles),
profileState,
params.viewer,
)
const creator = actors[list.creator]
if (!creator) {
throw new InvalidRequestError(`Actor not found: ${list.handle}`)
}
const listView = graphService.formatListView(list, actors)
if (!listView) {
const presentation = (
input: PresentationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, skeleton, hydration } = input
const { listUri, listitems, cursor } = skeleton
const list = ctx.views.list(listUri, hydration)
const items = mapDefined(listitems, ({ uri, did }) => {
const subject = ctx.views.profile(did, hydration)
if (!subject) return
return { uri, subject }
})
if (!list) {
throw new InvalidRequestError('List not found')
}
const items = mapDefined(listItems, (item) => {
const subject = actors[item.did]
if (!subject) return
return { uri: item.uri, subject }
})
return { list: listView, items, cursor }
return { list, items, cursor }
}
type Context = {
db: Database
actorService: ActorService
graphService: GraphService
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
@ -124,10 +92,7 @@ type Params = QueryParams & {
}
type SkeletonState = {
params: Params
list: Actor & ListInfo
listItems: (Actor & { uri: string; cid: string; sortAt: string })[]
listUri: string
listitems: ListItemInfo[]
cursor?: string
}
type HydrationState = SkeletonState & ProfileHydrationState
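
getList hydrates two independent kinds of state in parallel (the list record and the member profiles) and joins them with mergeStates before presentation. A rough sketch of that fan-out-and-merge pattern with simplified stand-in state types (the real HydrationState in hydration/hydrator.ts carries more maps than shown here):

```ts
// Sketch of the parallel-hydrate-then-merge pattern used by getList above.
// SketchState is a simplified stand-in for the real HydrationState.
const mergeMaps = <V>(a?: Map<string, V>, b?: Map<string, V>): Map<string, V> | undefined =>
  a && b ? new Map([...a, ...b]) : (a ?? b)

type SketchState = {
  lists?: Map<string, unknown>
  profiles?: Map<string, unknown>
}

const mergeStatesSketch = (a: SketchState, b: SketchState): SketchState => ({
  lists: mergeMaps(a.lists, b.lists),
  profiles: mergeMaps(a.profiles, b.profiles),
})

// Usage mirrors the hydration step above:
//   const [listState, profileState] = await Promise.all([hydrateLists(...), hydrateProfiles(...)])
//   return mergeStatesSketch(listState, profileState)
```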

@ -1,13 +1,17 @@
import { mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/graph/getListBlocks'
import { paginate, TimeCidKeyset } from '../../../../db/pagination'
import AppContext from '../../../../context'
import { Database } from '../../../../db'
import { Actor } from '../../../../db/tables/actor'
import { GraphService, ListInfo } from '../../../../services/graph'
import { ActorService, ProfileHydrationState } from '../../../../services/actor'
import { createPipeline, noRules } from '../../../../pipeline'
import {
createPipeline,
HydrationFnInput,
noRules,
PresentationFnInput,
SkeletonFnInput,
} from '../../../../pipeline'
import { Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { clearlyBadCursor } from '../../../util'
export default function (server: Server, ctx: AppContext) {
const getListBlocks = createPipeline(
@ -19,16 +23,8 @@ export default function (server: Server, ctx: AppContext) {
server.app.bsky.graph.getListBlocks({
auth: ctx.authVerifier.standard,
handler: async ({ params, auth }) => {
const db = ctx.db.getReplica()
const graphService = ctx.services.graph(db)
const actorService = ctx.services.actor(db)
const viewer = auth.credentials.iss
const result = await getListBlocks(
{ ...params, viewer },
{ db, actorService, graphService },
)
const result = await getListBlocks({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: result,
@ -38,72 +34,40 @@ export default function (server: Server, ctx: AppContext) {
}
const skeleton = async (
params: Params,
ctx: Context,
input: SkeletonFnInput<Context, Params>,
): Promise<SkeletonState> => {
const { db, graphService } = ctx
const { limit, cursor, viewer } = params
const { ref } = db.db.dynamic
if (TimeCidKeyset.clearlyBad(cursor)) {
return { params, listInfos: [] }
}
let listsReq = graphService
.getListsQb(viewer)
.whereExists(
db.db
.selectFrom('list_block')
.where('list_block.creator', '=', viewer)
.whereRef('list_block.subjectUri', '=', ref('list.uri'))
.selectAll(),
)
const keyset = new TimeCidKeyset(ref('list.createdAt'), ref('list.cid'))
listsReq = paginate(listsReq, {
limit,
cursor,
keyset,
})
const listInfos = await listsReq.execute()
return {
params,
listInfos,
cursor: keyset.packFromResult(listInfos),
const { ctx, params } = input
if (clearlyBadCursor(params.cursor)) {
return { listUris: [] }
}
const { listUris, cursor } =
await ctx.hydrator.dataplane.getBlocklistSubscriptions({
actorDid: params.viewer,
cursor: params.cursor,
limit: params.limit,
})
return { listUris, cursor: cursor || undefined }
}
const hydration = async (state: SkeletonState, ctx: Context) => {
const { actorService } = ctx
const { params, listInfos } = state
const profileState = await actorService.views.profileHydration(
listInfos.map((list) => list.creator),
{ viewer: params.viewer },
)
return { ...state, ...profileState }
const hydration = async (
input: HydrationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, params, skeleton } = input
return await ctx.hydrator.hydrateLists(skeleton.listUris, params.viewer)
}
const presentation = (state: HydrationState, ctx: Context) => {
const { actorService, graphService } = ctx
const { params, listInfos, cursor, ...profileState } = state
const actors = actorService.views.profilePresentation(
Object.keys(profileState.profiles),
profileState,
params.viewer,
)
const lists = mapDefined(listInfos, (list) =>
graphService.formatListView(list, actors),
)
const presentation = (
input: PresentationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, skeleton, hydration } = input
const { listUris, cursor } = skeleton
const lists = mapDefined(listUris, (uri) => ctx.views.list(uri, hydration))
return { lists, cursor }
}
type Context = {
db: Database
actorService: ActorService
graphService: GraphService
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
@ -111,9 +75,6 @@ type Params = QueryParams & {
}
type SkeletonState = {
params: Params
listInfos: (Actor & ListInfo)[]
listUris: string[]
cursor?: string
}
type HydrationState = SkeletonState & ProfileHydrationState
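
getListBlocks, like getListMutes and getLists below, passes noRules as its third stage, since no per-item block/mute filtering is needed beyond what list hydration already encodes. Presumably noRules is just an identity step over the skeleton; a one-line sketch under that assumption:

```ts
// Assumed shape of the noRules helper imported from '../../../../pipeline':
// it passes the skeleton through unchanged when a route needs no filtering stage.
const noRulesSketch = <S>(input: { skeleton: S }): S => input.skeleton
```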

@ -1,58 +1,80 @@
import { mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import { paginate, TimeCidKeyset } from '../../../../db/pagination'
import { QueryParams } from '../../../../lexicon/types/app/bsky/graph/getListBlocks'
import AppContext from '../../../../context'
import {
createPipeline,
HydrationFnInput,
noRules,
PresentationFnInput,
SkeletonFnInput,
} from '../../../../pipeline'
import { Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { clearlyBadCursor } from '../../../util'
export default function (server: Server, ctx: AppContext) {
const getListMutes = createPipeline(
skeleton,
hydration,
noRules,
presentation,
)
server.app.bsky.graph.getListMutes({
auth: ctx.authVerifier.standard,
handler: async ({ params, auth }) => {
const { limit, cursor } = params
const requester = auth.credentials.iss
if (TimeCidKeyset.clearlyBad(cursor)) {
return {
encoding: 'application/json',
body: { lists: [] },
}
}
const db = ctx.db.getReplica()
const { ref } = db.db.dynamic
const graphService = ctx.services.graph(db)
let listsReq = graphService
.getListsQb(requester)
.whereExists(
db.db
.selectFrom('list_mute')
.where('list_mute.mutedByDid', '=', requester)
.whereRef('list_mute.listUri', '=', ref('list.uri'))
.selectAll(),
)
const keyset = new TimeCidKeyset(ref('list.createdAt'), ref('list.cid'))
listsReq = paginate(listsReq, {
limit,
cursor,
keyset,
})
const listsRes = await listsReq.execute()
const actorService = ctx.services.actor(db)
const profiles = await actorService.views.profiles(listsRes, requester)
const lists = mapDefined(listsRes, (row) =>
graphService.formatListView(row, profiles),
)
const viewer = auth.credentials.iss
const result = await getListMutes({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: {
lists,
cursor: keyset.packFromResult(listsRes),
},
body: result,
}
},
})
}
const skeleton = async (
input: SkeletonFnInput<Context, Params>,
): Promise<SkeletonState> => {
const { ctx, params } = input
if (clearlyBadCursor(params.cursor)) {
return { listUris: [] }
}
const { listUris, cursor } =
await ctx.hydrator.dataplane.getMutelistSubscriptions({
actorDid: params.viewer,
cursor: params.cursor,
limit: params.limit,
})
return { listUris, cursor: cursor || undefined }
}
const hydration = async (
input: HydrationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, params, skeleton } = input
return await ctx.hydrator.hydrateLists(skeleton.listUris, params.viewer)
}
const presentation = (
input: PresentationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, skeleton, hydration } = input
const { listUris, cursor } = skeleton
const lists = mapDefined(listUris, (uri) => ctx.views.list(uri, hydration))
return { lists, cursor }
}
type Context = {
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
viewer: string
}
type SkeletonState = {
listUris: string[]
cursor?: string
}

@ -1,63 +1,79 @@
import { mapDefined } from '@atproto/common'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import { paginate, TimeCidKeyset } from '../../../../db/pagination'
import { QueryParams } from '../../../../lexicon/types/app/bsky/graph/getLists'
import AppContext from '../../../../context'
import {
createPipeline,
HydrationFnInput,
noRules,
PresentationFnInput,
SkeletonFnInput,
} from '../../../../pipeline'
import { Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { clearlyBadCursor } from '../../../util'
export default function (server: Server, ctx: AppContext) {
const getLists = createPipeline(skeleton, hydration, noRules, presentation)
server.app.bsky.graph.getLists({
auth: ctx.authVerifier.standardOptional,
handler: async ({ params, auth }) => {
const { actor, limit, cursor } = params
const requester = auth.credentials.iss
if (TimeCidKeyset.clearlyBad(cursor)) {
return {
encoding: 'application/json',
body: { lists: [] },
}
}
const db = ctx.db.getReplica()
const { ref } = db.db.dynamic
const actorService = ctx.services.actor(db)
const graphService = ctx.services.graph(db)
const creatorRes = await actorService.getActor(actor)
if (!creatorRes) {
throw new InvalidRequestError(`Actor not found: ${actor}`)
}
let listsReq = graphService
.getListsQb(requester)
.where('list.creator', '=', creatorRes.did)
const keyset = new TimeCidKeyset(ref('list.sortAt'), ref('list.cid'))
listsReq = paginate(listsReq, {
limit,
cursor,
keyset,
})
const [listsRes, profiles] = await Promise.all([
listsReq.execute(),
actorService.views.profiles([creatorRes], requester),
])
if (!profiles[creatorRes.did]) {
throw new InvalidRequestError(`Actor not found: ${actor}`)
}
const lists = mapDefined(listsRes, (row) =>
graphService.formatListView(row, profiles),
)
const viewer = auth.credentials.iss
const result = await getLists({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: {
lists,
cursor: keyset.packFromResult(listsRes),
},
body: result,
}
},
})
}
const skeleton = async (
input: SkeletonFnInput<Context, Params>,
): Promise<SkeletonState> => {
const { ctx, params } = input
if (clearlyBadCursor(params.cursor)) {
return { listUris: [] }
}
const { listUris, cursor } = await ctx.hydrator.dataplane.getActorLists({
actorDid: params.actor,
cursor: params.cursor,
limit: params.limit,
})
return { listUris, cursor: cursor || undefined }
}
const hydration = async (
input: HydrationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, params, skeleton } = input
const { viewer } = params
const { listUris } = skeleton
return ctx.hydrator.hydrateLists(listUris, viewer)
}
const presentation = (
input: PresentationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, skeleton, hydration } = input
const { listUris, cursor } = skeleton
const lists = mapDefined(listUris, (uri) => {
return ctx.views.list(uri, hydration)
})
return { lists, cursor }
}
type Context = {
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
viewer: string | null
}
type SkeletonState = {
listUris: string[]
cursor?: string
}

@ -1,62 +1,79 @@
import { mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import { paginate, TimeCidKeyset } from '../../../../db/pagination'
import { QueryParams } from '../../../../lexicon/types/app/bsky/graph/getMutes'
import AppContext from '../../../../context'
import { notSoftDeletedClause } from '../../../../db/util'
import { Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import {
HydrationFnInput,
PresentationFnInput,
SkeletonFnInput,
createPipeline,
noRules,
} from '../../../../pipeline'
import { clearlyBadCursor } from '../../../util'
export default function (server: Server, ctx: AppContext) {
const getMutes = createPipeline(skeleton, hydration, noRules, presentation)
server.app.bsky.graph.getMutes({
auth: ctx.authVerifier.standard,
handler: async ({ params, auth }) => {
const { limit, cursor } = params
const requester = auth.credentials.iss
if (TimeCidKeyset.clearlyBad(cursor)) {
return {
encoding: 'application/json',
body: { mutes: [] },
}
}
const db = ctx.db.getReplica()
const { ref } = db.db.dynamic
let mutesReq = db.db
.selectFrom('mute')
.innerJoin('actor', 'actor.did', 'mute.subjectDid')
.where(notSoftDeletedClause(ref('actor')))
.where('mute.mutedByDid', '=', requester)
.selectAll('actor')
.select('mute.createdAt as createdAt')
const keyset = new CreatedAtDidKeyset(
ref('mute.createdAt'),
ref('mute.subjectDid'),
)
mutesReq = paginate(mutesReq, {
limit,
cursor,
keyset,
})
const mutesRes = await mutesReq.execute()
const actorService = ctx.services.actor(db)
const viewer = auth.credentials.iss
const result = await getMutes({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: {
cursor: keyset.packFromResult(mutesRes),
mutes: await actorService.views.profilesList(mutesRes, requester),
},
body: result,
}
},
})
}
export class CreatedAtDidKeyset extends TimeCidKeyset<{
createdAt: string
did: string // dids are treated identically to cids in TimeCidKeyset
}> {
labelResult(result: { createdAt: string; did: string }) {
return { primary: result.createdAt, secondary: result.did }
const skeleton = async (input: SkeletonFnInput<Context, Params>) => {
const { params, ctx } = input
if (clearlyBadCursor(params.cursor)) {
return { mutedDids: [] }
}
const { dids, cursor } = await ctx.hydrator.dataplane.getMutes({
actorDid: params.viewer,
cursor: params.cursor,
limit: params.limit,
})
return {
mutedDids: dids,
cursor: cursor || undefined,
}
}
const hydration = async (
input: HydrationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, params, skeleton } = input
const { viewer } = params
const { mutedDids } = skeleton
return ctx.hydrator.hydrateProfiles(mutedDids, viewer)
}
const presentation = (
input: PresentationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, hydration, skeleton } = input
const { mutedDids, cursor } = skeleton
const mutes = mapDefined(mutedDids, (did) => {
return ctx.views.profile(did, hydration)
})
return { mutes, cursor }
}
type Context = {
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
viewer: string
}
type SkeletonState = {
mutedDids: string[]
cursor?: string
}

@ -1,6 +1,5 @@
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { Relationship } from '../../../../lexicon/types/app/bsky/graph/defs'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.graph.getRelationships({
@ -15,42 +14,18 @@ export default function (server: Server, ctx: AppContext) {
},
}
}
const db = ctx.db.getPrimary()
const { ref } = db.db.dynamic
const res = await db.db
.selectFrom('actor')
.select([
'actor.did',
db.db
.selectFrom('follow')
.where('creator', '=', actor)
.whereRef('subjectDid', '=', ref('actor.did'))
.select('uri')
.as('following'),
db.db
.selectFrom('follow')
.whereRef('creator', '=', ref('actor.did'))
.where('subjectDid', '=', actor)
.select('uri')
.as('followedBy'),
])
.where('actor.did', 'in', others)
.execute()
const relationshipsMap = res.reduce((acc, cur) => {
return acc.set(cur.did, {
did: cur.did,
following: cur.following ?? undefined,
followedBy: cur.followedBy ?? undefined,
})
}, new Map<string, Relationship>())
const res = await ctx.hydrator.actor.getProfileViewerStatesNaive(
others,
actor,
)
const relationships = others.map((did) => {
const relationship = relationshipsMap.get(did)
return relationship
const subject = res.get(did)
return subject
? {
$type: 'app.bsky.graph.defs#relationship',
...relationship,
did,
following: subject.following,
followedBy: subject.followedBy,
}
: {
$type: 'app.bsky.graph.defs#notFoundActor',
@ -58,7 +33,6 @@ export default function (server: Server, ctx: AppContext) {
notFound: true,
}
})
return {
encoding: 'application/json',
body: {

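getRelationships now resolves all viewer states in a single hydrator call and maps each requested did onto either a #relationship or a #notFoundActor view. A small sketch of that response shaping, assuming the viewer-state map carries optional following/followedBy at-uris as the handler above suggests (the notFoundActor field name is assumed, since that line is elided in the diff):

```ts
// Sketch of the response shaping in getRelationships above.
// ViewerState is inferred from the handler; the real return type of
// getProfileViewerStatesNaive may carry more fields.
type ViewerState = { following?: string; followedBy?: string }

const toRelationships = (others: string[], states: Map<string, ViewerState>) =>
  others.map((did) => {
    const state = states.get(did)
    return state
      ? {
          $type: 'app.bsky.graph.defs#relationship',
          did,
          following: state.following,
          followedBy: state.followedBy,
        }
      : {
          $type: 'app.bsky.graph.defs#notFoundActor',
          actor: did, // field name assumed; the corresponding line is elided in the diff above
          notFound: true,
        }
  })
```
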
@ -1,139 +1,98 @@
import { sql } from 'kysely'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { mapDefined } from '@atproto/common'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Database } from '../../../../db'
import { ActorService } from '../../../../services/actor'
const RESULT_LENGTH = 10
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/graph/getSuggestedFollowsByActor'
import AppContext from '../../../../context'
import {
HydrationFnInput,
PresentationFnInput,
RulesFnInput,
SkeletonFnInput,
createPipeline,
} from '../../../../pipeline'
import { Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
export default function (server: Server, ctx: AppContext) {
const getSuggestedFollowsByActor = createPipeline(
skeleton,
hydration,
noBlocksOrMutes,
presentation,
)
server.app.bsky.graph.getSuggestedFollowsByActor({
auth: ctx.authVerifier.standard,
handler: async ({ auth, params }) => {
const { actor } = params
const viewer = auth.credentials.iss
const db = ctx.db.getReplica()
const actorService = ctx.services.actor(db)
const actorDid = await actorService.getActorDid(actor)
if (!actorDid) {
throw new InvalidRequestError('Actor not found')
}
const skeleton = await getSkeleton(
{
actor: actorDid,
viewer,
},
{
db,
actorService,
},
const result = await getSuggestedFollowsByActor(
{ ...params, viewer },
ctx,
)
const hydrationState = await actorService.views.profileDetailHydration(
skeleton.map((a) => a.did),
{ viewer },
)
const presentationState = actorService.views.profileDetailPresentation(
skeleton.map((a) => a.did),
hydrationState,
{ viewer },
)
const suggestions = Object.values(presentationState).filter((profile) => {
return (
!profile.viewer?.muted &&
!profile.viewer?.mutedByList &&
!profile.viewer?.blocking &&
!profile.viewer?.blockedBy
)
})
return {
encoding: 'application/json',
body: { suggestions },
body: result,
}
},
})
}
async function getSkeleton(
params: {
actor: string
viewer: string
},
ctx: {
db: Database
actorService: ActorService
},
): Promise<{ did: string }[]> {
const actorsViewerFollows = ctx.db.db
.selectFrom('follow')
.where('creator', '=', params.viewer)
.select('subjectDid')
const mostLikedAccounts = await ctx.db.db
.selectFrom(
ctx.db.db
.selectFrom('like')
.where('creator', '=', params.actor)
.select(sql`split_part(subject, '/', 3)`.as('subjectDid'))
.orderBy('sortAt', 'desc')
.limit(1000) // limit to 1000
.as('likes'),
)
.select('likes.subjectDid as did')
.select((qb) => qb.fn.count('likes.subjectDid').as('count'))
.where('likes.subjectDid', 'not in', actorsViewerFollows)
.where('likes.subjectDid', 'not in', [params.actor, params.viewer])
.groupBy('likes.subjectDid')
.orderBy('count', 'desc')
.limit(RESULT_LENGTH)
.execute()
const resultDids = mostLikedAccounts.map((a) => ({ did: a.did })) as {
did: string
}[]
if (resultDids.length < RESULT_LENGTH) {
// backfill with popular accounts followed by actor
const mostPopularAccountsActorFollows = await ctx.db.db
.selectFrom('follow')
.innerJoin('profile_agg', 'follow.subjectDid', 'profile_agg.did')
.select('follow.subjectDid as did')
.where('follow.creator', '=', params.actor)
.where('follow.subjectDid', '!=', params.viewer)
.where('follow.subjectDid', 'not in', actorsViewerFollows)
.if(resultDids.length > 0, (qb) =>
qb.where(
'subjectDid',
'not in',
resultDids.map((a) => a.did),
),
)
.orderBy('profile_agg.followersCount', 'desc')
.limit(RESULT_LENGTH)
.execute()
resultDids.push(...mostPopularAccountsActorFollows)
const skeleton = async (input: SkeletonFnInput<Context, Params>) => {
const { params, ctx } = input
const [relativeToDid] = await ctx.hydrator.actor.getDids([params.actor])
if (!relativeToDid) {
throw new InvalidRequestError('Actor not found')
}
if (resultDids.length < RESULT_LENGTH) {
// backfill with suggested_follow table
const additional = await ctx.db.db
.selectFrom('suggested_follow')
.where(
'did',
'not in',
// exclude any we already have
resultDids.map((a) => a.did).concat([params.actor, params.viewer]),
)
// and aren't already followed by viewer
.where('did', 'not in', actorsViewerFollows)
.selectAll()
.execute()
resultDids.push(...additional)
const { dids, cursor } = await ctx.hydrator.dataplane.getFollowSuggestions({
actorDid: params.viewer,
relativeToDid,
})
return {
suggestedDids: dids,
cursor: cursor || undefined,
}
return resultDids
}
const hydration = async (
input: HydrationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, params, skeleton } = input
const { viewer } = params
const { suggestedDids } = skeleton
return ctx.hydrator.hydrateProfilesDetailed(suggestedDids, viewer)
}
const noBlocksOrMutes = (
input: RulesFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, skeleton, hydration } = input
skeleton.suggestedDids = skeleton.suggestedDids.filter(
(did) =>
!ctx.views.viewerBlockExists(did, hydration) &&
!ctx.views.viewerMuteExists(did, hydration),
)
return skeleton
}
const presentation = (
input: PresentationFnInput<Context, Params, SkeletonState>,
) => {
const { ctx, hydration, skeleton } = input
const { suggestedDids } = skeleton
const suggestions = mapDefined(suggestedDids, (did) =>
ctx.views.profileDetailed(did, hydration),
)
return { suggestions }
}
type Context = {
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
viewer: string
}
type SkeletonState = {
suggestedDids: string[]
}
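
The old getSuggestedFollowsByActor computed suggestions in SQL: group the actor's 1000 most recent likes by subject, then backfill with the actor's most-followed follows and the suggested_follow table. In v2 that heuristic lives behind dataplane.getFollowSuggestions, and the appview only filters blocks/mutes and presents detailed profiles. For reference, a purely illustrative sketch of how a dataplane might reproduce the first heuristic over in-memory data (the real dataplane query is not part of this diff):

```ts
// Illustrative only: rank an actor's recent like subjects to produce follow
// suggestions, mirroring the SQL the old handler ran. Data access is a stand-in;
// the real dataplane implementation is not shown in this diff.
type Like = { subjectDid: string; sortAt: string }

function rankBySharedLikes(
  recentLikes: Like[], // e.g. the actor's 1000 most recent likes
  alreadyFollowing: Set<string>, // dids the viewer already follows
  exclude: Set<string>, // the actor and viewer themselves
  limit = 10,
): string[] {
  const counts = new Map<string, number>()
  for (const like of recentLikes) {
    const did = like.subjectDid
    if (alreadyFollowing.has(did) || exclude.has(did)) continue
    counts.set(did, (counts.get(did) ?? 0) + 1)
  }
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1]) // most-liked accounts first
    .slice(0, limit)
    .map(([did]) => did)
}
```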

@ -1,9 +1,7 @@
import assert from 'node:assert'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { MuteOperation_Type } from '../../../../proto/bsync_pb'
import { BsyncClient } from '../../../../bsync'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.graph.muteActor({
@ -11,44 +9,13 @@ export default function (server: Server, ctx: AppContext) {
handler: async ({ req, auth, input }) => {
const { actor } = input.body
const requester = auth.credentials.iss
const db = ctx.db.getPrimary()
const subjectDid = await ctx.services.actor(db).getActorDid(actor)
if (!subjectDid) {
throw new InvalidRequestError(`Actor not found: ${actor}`)
}
if (subjectDid === requester) {
throw new InvalidRequestError('Cannot mute oneself')
}
const muteActor = async () => {
await ctx.services.graph(db).muteActor({
subjectDid,
mutedByDid: requester,
})
}
const addBsyncMuteOp = async (bsyncClient: BsyncClient) => {
await bsyncClient.addMuteOperation({
type: MuteOperation_Type.ADD,
actorDid: requester,
subject: subjectDid,
})
}
if (ctx.cfg.bsyncOnlyMutes) {
assert(ctx.bsyncClient)
await addBsyncMuteOp(ctx.bsyncClient)
} else {
await muteActor()
if (ctx.bsyncClient) {
try {
await addBsyncMuteOp(ctx.bsyncClient)
} catch (err) {
req.log.warn(err, 'failed to sync mute op to bsync')
}
}
}
const [did] = await ctx.hydrator.actor.getDids([actor])
if (!did) throw new InvalidRequestError('Actor not found')
await ctx.bsyncClient.addMuteOperation({
type: MuteOperation_Type.ADD,
actorDid: requester,
subject: did,
})
},
})
}
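
Mute writes no longer touch appview tables; muteActor above and the muteActorList/unmuteActor/unmuteActorList handlers below all reduce to a single bsync mute operation keyed by the requester and the resolved subject. A condensed sketch of that shared shape (only addMuteOperation and MuteOperation_Type come from the diff; the narrowed client type and the wrapper are illustrative):

```ts
// Sketch of the shared mute/unmute shape across the four handlers.
import { MuteOperation_Type } from '../../../../proto/bsync_pb'

type BsyncMuteClient = {
  addMuteOperation(op: {
    type: MuteOperation_Type
    actorDid: string
    subject: string // a did (actor mutes) or a list at-uri (list mutes)
  }): Promise<unknown>
}

const sendMuteOp = (
  bsync: BsyncMuteClient,
  requester: string,
  subject: string,
  add: boolean,
) =>
  bsync.addMuteOperation({
    type: add ? MuteOperation_Type.ADD : MuteOperation_Type.REMOVE,
    actorDid: requester,
    subject,
  })
```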

@ -1,55 +1,18 @@
import assert from 'node:assert'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import * as lex from '../../../../lexicon/lexicons'
import AppContext from '../../../../context'
import { AtUri } from '@atproto/syntax'
import { MuteOperation_Type } from '../../../../proto/bsync_pb'
import { BsyncClient } from '../../../../bsync'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.graph.muteActorList({
auth: ctx.authVerifier.standard,
handler: async ({ req, auth, input }) => {
handler: async ({ auth, input }) => {
const { list } = input.body
const requester = auth.credentials.iss
const db = ctx.db.getPrimary()
const listUri = new AtUri(list)
const collId = lex.ids.AppBskyGraphList
if (listUri.collection !== collId) {
throw new InvalidRequestError(`Invalid collection: expected: ${collId}`)
}
const muteActorList = async () => {
await ctx.services.graph(db).muteActorList({
list,
mutedByDid: requester,
})
}
const addBsyncMuteOp = async (bsyncClient: BsyncClient) => {
await bsyncClient.addMuteOperation({
type: MuteOperation_Type.ADD,
actorDid: requester,
subject: list,
})
}
if (ctx.cfg.bsyncOnlyMutes) {
assert(ctx.bsyncClient)
await addBsyncMuteOp(ctx.bsyncClient)
} else {
await muteActorList()
if (ctx.bsyncClient) {
try {
await addBsyncMuteOp(ctx.bsyncClient)
} catch (err) {
req.log.warn(err, 'failed to sync mute op to bsync')
}
}
}
await ctx.bsyncClient.addMuteOperation({
type: MuteOperation_Type.ADD,
actorDid: requester,
subject: list,
})
},
})
}

@ -1,54 +1,21 @@
import assert from 'node:assert'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { MuteOperation_Type } from '../../../../proto/bsync_pb'
import { BsyncClient } from '../../../../bsync'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.graph.unmuteActor({
auth: ctx.authVerifier.standard,
handler: async ({ req, auth, input }) => {
handler: async ({ auth, input }) => {
const { actor } = input.body
const requester = auth.credentials.iss
const db = ctx.db.getPrimary()
const subjectDid = await ctx.services.actor(db).getActorDid(actor)
if (!subjectDid) {
throw new InvalidRequestError(`Actor not found: ${actor}`)
}
if (subjectDid === requester) {
throw new InvalidRequestError('Cannot mute oneself')
}
const unmuteActor = async () => {
await ctx.services.graph(db).unmuteActor({
subjectDid,
mutedByDid: requester,
})
}
const addBsyncMuteOp = async (bsyncClient: BsyncClient) => {
await bsyncClient.addMuteOperation({
type: MuteOperation_Type.REMOVE,
actorDid: requester,
subject: subjectDid,
})
}
if (ctx.cfg.bsyncOnlyMutes) {
assert(ctx.bsyncClient)
await addBsyncMuteOp(ctx.bsyncClient)
} else {
await unmuteActor()
if (ctx.bsyncClient) {
try {
await addBsyncMuteOp(ctx.bsyncClient)
} catch (err) {
req.log.warn(err, 'failed to sync mute op to bsync')
}
}
}
const [did] = await ctx.hydrator.actor.getDids([actor])
if (!did) throw new InvalidRequestError('Actor not found')
await ctx.bsyncClient.addMuteOperation({
type: MuteOperation_Type.REMOVE,
actorDid: requester,
subject: did,
})
},
})
}

@ -1,45 +1,18 @@
import assert from 'node:assert'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { MuteOperation_Type } from '../../../../proto/bsync_pb'
import { BsyncClient } from '../../../../bsync'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.graph.unmuteActorList({
auth: ctx.authVerifier.standard,
handler: async ({ req, auth, input }) => {
handler: async ({ auth, input }) => {
const { list } = input.body
const requester = auth.credentials.iss
const db = ctx.db.getPrimary()
const unmuteActorList = async () => {
await ctx.services.graph(db).unmuteActorList({
list,
mutedByDid: requester,
})
}
const addBsyncMuteOp = async (bsyncClient: BsyncClient) => {
await bsyncClient.addMuteOperation({
type: MuteOperation_Type.REMOVE,
actorDid: requester,
subject: list,
})
}
if (ctx.cfg.bsyncOnlyMutes) {
assert(ctx.bsyncClient)
await addBsyncMuteOp(ctx.bsyncClient)
} else {
await unmuteActorList()
if (ctx.bsyncClient) {
try {
await addBsyncMuteOp(ctx.bsyncClient)
} catch (err) {
req.log.warn(err, 'failed to sync mute op to bsync')
}
}
}
await ctx.bsyncClient.addMuteOperation({
type: MuteOperation_Type.REMOVE,
actorDid: requester,
subject: list,
})
},
})
}

@ -1,43 +1,74 @@
import { sql } from 'kysely'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import { countAll, notSoftDeletedClause } from '../../../../db/util'
import { QueryParams } from '../../../../lexicon/types/app/bsky/notification/getUnreadCount'
import AppContext from '../../../../context'
import {
HydrationFnInput,
PresentationFnInput,
SkeletonFnInput,
createPipeline,
noRules,
} from '../../../../pipeline'
import { Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
export default function (server: Server, ctx: AppContext) {
const getUnreadCount = createPipeline(
skeleton,
hydration,
noRules,
presentation,
)
server.app.bsky.notification.getUnreadCount({
auth: ctx.authVerifier.standard,
handler: async ({ auth, params }) => {
const requester = auth.credentials.iss
if (params.seenAt) {
throw new InvalidRequestError('The seenAt parameter is unsupported')
}
const db = ctx.db.getReplica()
const { ref } = db.db.dynamic
const result = await db.db
.selectFrom('notification')
.select(countAll.as('count'))
.innerJoin('actor', 'actor.did', 'notification.did')
.leftJoin('actor_state', 'actor_state.did', 'actor.did')
.innerJoin('record', 'record.uri', 'notification.recordUri')
.where(notSoftDeletedClause(ref('actor')))
.where(notSoftDeletedClause(ref('record')))
    // Ensure we hit notification_did_sortat_idx, handling the case where lastSeenNotifs is null.
.where('notification.did', '=', requester)
.where(
'notification.sortAt',
'>',
sql`coalesce(${ref('actor_state.lastSeenNotifs')}, ${''})`,
)
.executeTakeFirst()
const count = result?.count ?? 0
const viewer = auth.credentials.iss
const result = await getUnreadCount({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: { count },
body: result,
}
},
})
}
const skeleton = async (
input: SkeletonFnInput<Context, Params>,
): Promise<SkeletonState> => {
const { params, ctx } = input
if (params.seenAt) {
throw new InvalidRequestError('The seenAt parameter is unsupported')
}
const res = await ctx.hydrator.dataplane.getUnreadNotificationCount({
actorDid: params.viewer,
})
return {
count: res.count,
}
}
const hydration = async (
_input: HydrationFnInput<Context, Params, SkeletonState>,
) => {
return {}
}
const presentation = (
input: PresentationFnInput<Context, Params, SkeletonState>,
) => {
const { skeleton } = input
return { count: skeleton.count }
}
type Context = {
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
viewer: string
}
type SkeletonState = {
count: number
}

@ -1,16 +1,20 @@
import { InvalidRequestError } from '@atproto/xrpc-server'
import { jsonStringToLex } from '@atproto/lexicon'
import { mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import { QueryParams } from '../../../../lexicon/types/app/bsky/notification/listNotifications'
import AppContext from '../../../../context'
import { Database } from '../../../../db'
import { notSoftDeletedClause } from '../../../../db/util'
import { paginate, TimeCidKeyset } from '../../../../db/pagination'
import { BlockAndMuteState, GraphService } from '../../../../services/graph'
import { ActorInfoMap, ActorService } from '../../../../services/actor'
import { getSelfLabels, Labels, LabelService } from '../../../../services/label'
import { createPipeline } from '../../../../pipeline'
import {
createPipeline,
HydrationFnInput,
PresentationFnInput,
RulesFnInput,
SkeletonFnInput,
} from '../../../../pipeline'
import { Hydrator } from '../../../../hydration/hydrator'
import { Views } from '../../../../views'
import { Notification } from '../../../../proto/bsky_pb'
import { didFromUri } from '../../../../hydration/util'
import { clearlyBadCursor } from '../../../util'
export default function (server: Server, ctx: AppContext) {
const listNotifications = createPipeline(
@ -22,17 +26,8 @@ export default function (server: Server, ctx: AppContext) {
server.app.bsky.notification.listNotifications({
auth: ctx.authVerifier.standard,
handler: async ({ params, auth }) => {
const db = ctx.db.getReplica()
const actorService = ctx.services.actor(db)
const graphService = ctx.services.graph(db)
const labelService = ctx.services.label(db)
const viewer = auth.credentials.iss
const result = await listNotifications(
{ ...params, viewer },
{ db, actorService, graphService, labelService },
)
const result = await listNotifications({ ...params, viewer }, ctx)
return {
encoding: 'application/json',
body: result,
@ -42,141 +37,73 @@ export default function (server: Server, ctx: AppContext) {
}
const skeleton = async (
params: Params,
ctx: Context,
input: SkeletonFnInput<Context, Params>,
): Promise<SkeletonState> => {
const { db } = ctx
const { limit, cursor, viewer } = params
const { ref } = db.db.dynamic
const { params, ctx } = input
if (params.seenAt) {
throw new InvalidRequestError('The seenAt parameter is unsupported')
}
if (NotifsKeyset.clearlyBad(cursor)) {
return { params, notifs: [] }
if (clearlyBadCursor(params.cursor)) {
return { notifs: [] }
}
let notifBuilder = db.db
.selectFrom('notification as notif')
.where('notif.did', '=', viewer)
.where((clause) =>
clause
.where('reasonSubject', 'is', null)
.orWhereExists(
db.db
.selectFrom('record as subject')
.selectAll()
.whereRef('subject.uri', '=', ref('notif.reasonSubject')),
),
)
.select([
'notif.author as authorDid',
'notif.recordUri as uri',
'notif.recordCid as cid',
'notif.reason as reason',
'notif.reasonSubject as reasonSubject',
'notif.sortAt as indexedAt',
])
const keyset = new NotifsKeyset(ref('notif.sortAt'), ref('notif.recordCid'))
notifBuilder = paginate(notifBuilder, {
cursor,
limit,
keyset,
tryIndex: true,
})
const actorStateQuery = db.db
.selectFrom('actor_state')
.selectAll()
.where('did', '=', viewer)
const [notifs, actorState] = await Promise.all([
notifBuilder.execute(),
actorStateQuery.executeTakeFirst(),
const [res, lastSeenRes] = await Promise.all([
ctx.hydrator.dataplane.getNotifications({
actorDid: params.viewer,
cursor: params.cursor,
limit: params.limit,
}),
ctx.hydrator.dataplane.getNotificationSeen({
actorDid: params.viewer,
}),
])
  // @NOTE for the first page of results, if there's no last-seen time, consider the top notification unread
  // rather than all notifications. This is a bit of a hack to be more graceful when seen times are out of sync.
let lastSeenDate = lastSeenRes.timestamp?.toDate()
if (!lastSeenDate && !params.cursor) {
lastSeenDate = res.notifications.at(0)?.timestamp?.toDate()
}
return {
params,
notifs,
cursor: keyset.packFromResult(notifs),
lastSeenNotifs: actorState?.lastSeenNotifs,
notifs: res.notifications,
cursor: res.cursor || undefined,
lastSeenNotifs: lastSeenDate?.toISOString(),
}
}
const hydration = async (state: SkeletonState, ctx: Context) => {
const { graphService, actorService, labelService, db } = ctx
const { params, notifs } = state
const { viewer } = params
const dids = notifs.map((notif) => notif.authorDid)
const uris = notifs.map((notif) => notif.uri)
const [actors, records, labels, bam] = await Promise.all([
actorService.views.profiles(dids, viewer),
getRecordMap(db, uris),
labelService.getLabelsForUris(uris),
graphService.getBlockAndMuteState(dids.map((did) => [viewer, did])),
])
return { ...state, actors, records, labels, bam }
const hydration = async (
input: HydrationFnInput<Context, Params, SkeletonState>,
) => {
const { skeleton, params, ctx } = input
return ctx.hydrator.hydrateNotifications(skeleton.notifs, params.viewer)
}
const noBlockOrMutes = (state: HydrationState) => {
const { viewer } = state.params
state.notifs = state.notifs.filter(
(item) =>
!state.bam.block([viewer, item.authorDid]) &&
!state.bam.mute([viewer, item.authorDid]),
)
return state
}
const presentation = (state: HydrationState) => {
const { notifs, cursor, actors, records, labels, lastSeenNotifs } = state
const notifications = mapDefined(notifs, (notif) => {
const author = actors[notif.authorDid]
const record = records[notif.uri]
if (!author || !record) return undefined
const recordLabels = labels[notif.uri] ?? []
const recordSelfLabels = getSelfLabels({
uri: notif.uri,
cid: notif.cid,
record,
})
return {
uri: notif.uri,
cid: notif.cid,
author,
reason: notif.reason,
reasonSubject: notif.reasonSubject || undefined,
record,
isRead: lastSeenNotifs ? notif.indexedAt <= lastSeenNotifs : false,
indexedAt: notif.indexedAt,
labels: [...recordLabels, ...recordSelfLabels],
}
const noBlockOrMutes = (
input: RulesFnInput<Context, Params, SkeletonState>,
) => {
const { skeleton, hydration, ctx } = input
skeleton.notifs = skeleton.notifs.filter((item) => {
const did = didFromUri(item.uri)
return (
!ctx.views.viewerBlockExists(did, hydration) &&
!ctx.views.viewerMuteExists(did, hydration)
)
})
return { notifications, cursor, seenAt: lastSeenNotifs }
return skeleton
}
const getRecordMap = async (
db: Database,
uris: string[],
): Promise<RecordMap> => {
if (!uris.length) return {}
const { ref } = db.db.dynamic
const recordRows = await db.db
.selectFrom('record')
.select(['uri', 'json'])
.where('uri', 'in', uris)
.where(notSoftDeletedClause(ref('record')))
.execute()
return recordRows.reduce((acc, { uri, json }) => {
acc[uri] = jsonStringToLex(json) as Record<string, unknown>
return acc
}, {} as RecordMap)
const presentation = (
input: PresentationFnInput<Context, Params, SkeletonState>,
) => {
const { skeleton, hydration, ctx } = input
const { notifs, lastSeenNotifs, cursor } = skeleton
const notifications = mapDefined(notifs, (notif) =>
ctx.views.notification(notif, lastSeenNotifs, hydration),
)
return { notifications, cursor, seenAt: skeleton.lastSeenNotifs }
}
type Context = {
db: Database
actorService: ActorService
graphService: GraphService
labelService: LabelService
hydrator: Hydrator
views: Views
}
type Params = QueryParams & {
@ -184,32 +111,7 @@ type Params = QueryParams & {
}
type SkeletonState = {
params: Params
notifs: NotifRow[]
notifs: Notification[]
lastSeenNotifs?: string
cursor?: string
}
type HydrationState = SkeletonState & {
bam: BlockAndMuteState
actors: ActorInfoMap
records: RecordMap
labels: Labels
}
type RecordMap = { [uri: string]: Record<string, unknown> }
type NotifRow = {
authorDid: string
uri: string
cid: string
reason: string
reasonSubject: string | null
indexedAt: string
}
class NotifsKeyset extends TimeCidKeyset<NotifRow> {
labelResult(result: NotifRow) {
return { primary: result.indexedAt, secondary: result.cid }
}
}
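
The listNotifications skeleton now carries lastSeenNotifs (the stored seen time, or the newest notification's timestamp on an uncursored first page when none exists), and per-item rendering moves into ctx.views.notification(notif, lastSeenNotifs, hydration). Presumably that view computes isRead the same way the old inline presentation did; a sketch of the comparison, assuming notification timestamps are ISO-8601 strings so lexicographic order matches chronological order:

```ts
// Sketch of the isRead computation the notification view presumably performs,
// matching the old inline presentation logic shown above.
const isRead = (indexedAt: string, lastSeenNotifs?: string): boolean =>
  lastSeenNotifs ? indexedAt <= lastSeenNotifs : false

// e.g. isRead('2023-12-01T00:00:00.000Z', '2023-12-02T00:00:00.000Z') === true
```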

@ -1,15 +1,12 @@
import assert from 'node:assert'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { Platform } from '../../../../notifications'
import { CourierClient } from '../../../../courier'
import { AppPlatform } from '../../../../proto/courier_pb'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.notification.registerPush({
auth: ctx.authVerifier.standard,
handler: async ({ req, auth, input }) => {
handler: async ({ auth, input }) => {
const { token, platform, serviceDid, appId } = input.body
const did = auth.credentials.iss
if (serviceDid !== auth.credentials.aud) {
@ -20,44 +17,17 @@ export default function (server: Server, ctx: AppContext) {
'Unsupported platform: must be "ios", "android", or "web".',
)
}
const db = ctx.db.getPrimary()
const registerDeviceWithAppview = async () => {
await ctx.services
.actor(db)
.registerPushDeviceToken(did, token, platform as Platform, appId)
}
const registerDeviceWithCourier = async (
courierClient: CourierClient,
) => {
await courierClient.registerDeviceToken({
did,
token,
platform:
platform === 'ios'
? AppPlatform.IOS
: platform === 'android'
? AppPlatform.ANDROID
: AppPlatform.WEB,
appId,
})
}
if (ctx.cfg.courierOnlyRegistration) {
assert(ctx.courierClient)
await registerDeviceWithCourier(ctx.courierClient)
} else {
await registerDeviceWithAppview()
if (ctx.courierClient) {
try {
await registerDeviceWithCourier(ctx.courierClient)
} catch (err) {
req.log.warn(err, 'failed to register device token with courier')
}
}
}
await ctx.courierClient.registerDeviceToken({
did,
token,
platform:
platform === 'ios'
? AppPlatform.IOS
: platform === 'android'
? AppPlatform.ANDROID
: AppPlatform.WEB,
appId,
})
},
})
}
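
registerPush now registers the device token with courier only, mapping the platform string onto the AppPlatform enum inline. If more platforms ever appear, the mapping could be lifted into a small helper; an illustrative equivalent of the ternary in the handler above:

```ts
// Illustrative helper equivalent to the inline ternary in the handler above.
import { AppPlatform } from '../../../../proto/courier_pb'

const toAppPlatform = (platform: 'ios' | 'android' | 'web'): AppPlatform => {
  switch (platform) {
    case 'ios':
      return AppPlatform.IOS
    case 'android':
      return AppPlatform.ANDROID
    case 'web':
      return AppPlatform.WEB
  }
}
```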

@ -1,7 +1,6 @@
import { Timestamp } from '@bufbuild/protobuf'
import { Server } from '../../../../lexicon'
import { InvalidRequestError } from '@atproto/xrpc-server'
import AppContext from '../../../../context'
import { excluded } from '../../../../db/util'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.notification.updateSeen({
@ -9,25 +8,10 @@ export default function (server: Server, ctx: AppContext) {
handler: async ({ input, auth }) => {
const { seenAt } = input.body
const viewer = auth.credentials.iss
let parsed: string
try {
parsed = new Date(seenAt).toISOString()
} catch (_err) {
throw new InvalidRequestError('Invalid date')
}
const db = ctx.db.getPrimary()
await db.db
.insertInto('actor_state')
.values({ did: viewer, lastSeenNotifs: parsed })
.onConflict((oc) =>
oc.column('did').doUpdateSet({
lastSeenNotifs: excluded(db.db, 'lastSeenNotifs'),
}),
)
.executeTakeFirst()
await ctx.dataplane.updateNotificationSeen({
actorDid: viewer,
timestamp: Timestamp.fromDate(new Date(seenAt)),
})
},
})
}
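
updateSeen forwards the seen time to the dataplane as a protobuf Timestamp. Note that the old handler rejected unparseable dates with an InvalidRequestError, while the new body converts new Date(seenAt) directly; if that guard is still wanted, a sketch of keeping it ahead of the conversion (Timestamp.fromDate comes from @bufbuild/protobuf, as imported above):

```ts
// Sketch: validate seenAt before converting, preserving the old error behavior.
import { Timestamp } from '@bufbuild/protobuf'
import { InvalidRequestError } from '@atproto/xrpc-server'

const parseSeenAt = (seenAt: string): Timestamp => {
  const date = new Date(seenAt)
  if (isNaN(date.getTime())) {
    throw new InvalidRequestError('Invalid date')
  }
  return Timestamp.fromDate(date)
}
```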

@ -1,107 +1,57 @@
import { mapDefined } from '@atproto/common'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { countAll } from '../../../../db/util'
import { GenericKeyset, paginate } from '../../../../db/pagination'
import { InvalidRequestError } from '@atproto/xrpc-server'
import { GeneratorView } from '../../../../lexicon/types/app/bsky/feed/defs'
import { parseString } from '../../../../hydration/util'
import { clearlyBadCursor } from '../../../util'
// THIS IS A TEMPORARY UNSPECCED ROUTE
// @TODO when no query is given this currently mirrors getSuggestedFeeds; the query path is a plain
// search that does not weight results by popularity. In the future this may take popularity via
// likes into consideration w/ its own dataplane endpoint.
export default function (server: Server, ctx: AppContext) {
server.app.bsky.unspecced.getPopularFeedGenerators({
auth: ctx.authVerifier.standardOptional,
handler: async ({ auth, params }) => {
const { limit, cursor, query } = params
const requester = auth.credentials.iss
if (LikeCountKeyset.clearlyBad(cursor)) {
const viewer = auth.credentials.iss
if (clearlyBadCursor(params.cursor)) {
return {
encoding: 'application/json',
body: { feeds: [] },
}
}
const db = ctx.db.getReplica()
const { ref } = db.db.dynamic
const feedService = ctx.services.feed(db)
const actorService = ctx.services.actor(db)
let inner = db.db
.selectFrom('feed_generator')
.select([
'uri',
'cid',
db.db
.selectFrom('like')
.whereRef('like.subject', '=', ref('feed_generator.uri'))
.select(countAll.as('count'))
.as('likeCount'),
])
let uris: string[]
let cursor: string | undefined
const query = params.query?.trim() ?? ''
if (query) {
inner = inner.where((qb) =>
qb
.where('feed_generator.displayName', 'ilike', `%${query}%`)
.orWhere('feed_generator.description', 'ilike', `%${query}%`),
)
const res = await ctx.dataplane.searchFeedGenerators({
query,
limit: params.limit,
})
uris = res.uris
} else {
const res = await ctx.dataplane.getSuggestedFeeds({
actorDid: viewer ?? undefined,
limit: params.limit,
cursor: params.cursor,
})
uris = res.uris
cursor = parseString(res.cursor)
}
let builder = db.db.selectFrom(inner.as('feed_gens')).selectAll()
const keyset = new LikeCountKeyset(ref('likeCount'), ref('cid'))
builder = paginate(builder, { limit, cursor, keyset, direction: 'desc' })
const res = await builder.execute()
const genInfos = await feedService.getFeedGeneratorInfos(
res.map((feed) => feed.uri),
requester,
const hydration = await ctx.hydrator.hydrateFeedGens(uris, viewer)
const feedViews = mapDefined(uris, (uri) =>
ctx.views.feedGenerator(uri, hydration),
)
const creators = Object.values(genInfos).map((gen) => gen.creator)
const profiles = await actorService.views.profiles(creators, requester)
const genViews: GeneratorView[] = []
for (const row of res) {
const gen = genInfos[row.uri]
if (!gen) continue
const view = feedService.views.formatFeedGeneratorView(gen, profiles)
if (view) {
genViews.push(view)
}
}
return {
encoding: 'application/json',
body: {
cursor: keyset.packFromResult(res),
feeds: genViews,
feeds: feedViews,
cursor,
},
}
},
})
}
type Result = { likeCount: number; cid: string }
type LabeledResult = { primary: number; secondary: string }
export class LikeCountKeyset extends GenericKeyset<Result, LabeledResult> {
labelResult(result: Result) {
return {
primary: result.likeCount,
secondary: result.cid,
}
}
labeledResultToCursor(labeled: LabeledResult) {
return {
primary: labeled.primary.toString(),
secondary: labeled.secondary,
}
}
cursorToLabeledResult(cursor: { primary: string; secondary: string }) {
const likes = parseInt(cursor.primary, 10)
if (isNaN(likes)) {
throw new InvalidRequestError('Malformed cursor')
}
return {
primary: likes,
secondary: cursor.secondary,
}
}
}
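
The hunk above is representative of the v2 pattern used throughout this PR: SQL keyset pagination is dropped in favor of asking the dataplane for bare URIs, hydrating them, then projecting views. A minimal sketch of that pipeline, assuming an `AppContext` wired as in this diff:

```ts
import { mapDefined } from '@atproto/common'
import AppContext from '../../../../context'

// Sketch only: fetch uris from the dataplane, hydrate them for the viewer,
// then map each uri to a lexicon view (skipping anything that failed to hydrate).
async function suggestedFeedViews(
  ctx: AppContext,
  viewer: string | null,
  limit?: number,
) {
  const res = await ctx.dataplane.getSuggestedFeeds({
    actorDid: viewer ?? undefined,
    limit,
  })
  const hydration = await ctx.hydrator.hydrateFeedGens(res.uris, viewer)
  return mapDefined(res.uris, (uri) => ctx.views.feedGenerator(uri, hydration))
}
```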

@ -5,11 +5,12 @@ import AppContext from '../../../../context'
export default function (server: Server, ctx: AppContext) {
server.app.bsky.unspecced.getTaggedSuggestions({
handler: async () => {
const suggestions = await ctx.db
.getReplica()
.db.selectFrom('tagged_suggestion')
.selectAll()
.execute()
const res = await ctx.dataplane.getSuggestedEntities({})
const suggestions = res.entities.map((entity) => ({
tag: entity.tag,
subjectType: entity.subjectType,
subject: entity.subject,
}))
return {
encoding: 'application/json',
body: {

@ -1,19 +0,0 @@
import { TimeCidKeyset } from '../../../../db/pagination'
import { FeedRow } from '../../../../services/feed/types'
export enum FeedAlgorithm {
ReverseChronological = 'reverse-chronological',
}
export class FeedKeyset extends TimeCidKeyset<FeedRow> {
labelResult(result: FeedRow) {
return { primary: result.sortAt, secondary: result.cid }
}
}
// For users with sparse feeds, avoid scanning more than one week for a single page
export const getFeedDateThreshold = (from: string | undefined, days = 1) => {
const timelineDateThreshold = from ? new Date(from) : new Date()
timelineDateThreshold.setDate(timelineDateThreshold.getDate() - days)
return timelineDateThreshold.toISOString()
}

@ -5,11 +5,16 @@ import axios, { AxiosError } from 'axios'
import { CID } from 'multiformats/cid'
import { ensureValidDid } from '@atproto/syntax'
import { forwardStreamErrors, VerifyCidTransform } from '@atproto/common'
import { IdResolver, DidNotFoundError } from '@atproto/identity'
import { DidNotFoundError } from '@atproto/identity'
import AppContext from '../context'
import { httpLogger as log } from '../logger'
import { retryHttp } from '../util/retry'
import { Database } from '../db'
import {
Code,
getServiceEndpoint,
isDataplaneError,
unpackIdentityServices,
} from '../data-plane'
// Resolve and verify blob from its origin host
@ -31,8 +36,7 @@ export const createRouter = (ctx: AppContext): express.Router => {
return next(createError(400, 'Invalid cid'))
}
const db = ctx.db.getReplica()
const verifiedImage = await resolveBlob(did, cid, db, ctx.idResolver)
const verifiedImage = await resolveBlob(ctx, did, cid)
// Send chunked response, destroying stream early (before
// closing chunk) if the bytes don't match the expected cid.
@ -76,24 +80,29 @@ export const createRouter = (ctx: AppContext): express.Router => {
return router
}
export async function resolveBlob(
did: string,
cid: CID,
db: Database,
idResolver: IdResolver,
) {
export async function resolveBlob(ctx: AppContext, did: string, cid: CID) {
const cidStr = cid.toString()
const [{ pds }, takedown] = await Promise.all([
idResolver.did.resolveAtprotoData(did), // @TODO cache did info
db.db
.selectFrom('blob_takedown')
.select('takedownRef')
.where('did', '=', did)
.where('cid', '=', cid.toString())
.executeTakeFirst(),
const [identity, { takenDown }] = await Promise.all([
ctx.dataplane.getIdentityByDid({ did }).catch((err) => {
if (isDataplaneError(err, Code.NotFound)) {
return undefined
}
throw err
}),
ctx.dataplane.getBlobTakedown({ did, cid: cid.toString() }),
])
if (takedown) {
const services = identity && unpackIdentityServices(identity.services)
const pds =
services &&
getServiceEndpoint(services, {
id: 'atproto_pds',
type: 'AtprotoPersonalDataServer',
})
if (!pds) {
throw createError(404, 'Origin not found')
}
if (takenDown) {
throw createError(404, 'Blob not found')
}
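
For reference, the packed identity `services` bytes above decode to a small JSON map keyed by service id. A sketch with an illustrative payload (the helpers come from the data-plane client later in this diff; real bytes come from `ctx.dataplane.getIdentityByDid()`):

```ts
import { getServiceEndpoint, unpackIdentityServices } from '../data-plane'

// Illustrative payload only.
const servicesBytes = new TextEncoder().encode(
  JSON.stringify({
    atproto_pds: {
      Type: 'AtprotoPersonalDataServer',
      URL: 'https://pds.example.com',
    },
  }),
)
const services = unpackIdentityServices(servicesBytes)
const pds = getServiceEndpoint(services, {
  id: 'atproto_pds',
  type: 'AtprotoPersonalDataServer',
}) // => 'https://pds.example.com'
```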

@ -1,6 +1,5 @@
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { Actor } from '../../../../db/tables/actor'
import { mapDefined } from '@atproto/common'
import { INVALID_HANDLE } from '@atproto/syntax'
@ -9,25 +8,16 @@ export default function (server: Server, ctx: AppContext) {
auth: ctx.authVerifier.roleOrAdminService,
handler: async ({ params }) => {
const { dids } = params
const db = ctx.db.getPrimary()
const actorService = ctx.services.actor(db)
const [actors, profiles] = await Promise.all([
actorService.getActors(dids, true),
actorService.getProfileRecords(dids, true),
])
const actorByDid = actors.reduce((acc, cur) => {
return acc.set(cur.did, cur)
}, new Map<string, Actor>())
const actors = await ctx.hydrator.actor.getActors(dids, true)
const infos = mapDefined(dids, (did) => {
const info = actorByDid.get(did)
const info = actors.get(did)
if (!info) return
const profile = profiles.get(did)
return {
did,
handle: info.handle ?? INVALID_HANDLE,
relatedRecords: profile ? [profile] : undefined,
indexedAt: info.indexedAt,
relatedRecords: info.profile ? [info.profile] : undefined,
indexedAt: (info.sortedAt ?? new Date(0)).toISOString(),
}
})

@ -8,7 +8,7 @@ export default function (server: Server, ctx: AppContext) {
auth: ctx.authVerifier.roleOrAdminService,
handler: async ({ params }) => {
const { did, uri, blob } = params
const modService = ctx.services.moderation(ctx.db.getPrimary())
let body: OutputSchema | null = null
if (blob) {
if (!did) {
@ -16,46 +16,48 @@ export default function (server: Server, ctx: AppContext) {
'Must provide a did to request blob state',
)
}
const takedown = await modService.getBlobTakedownRef(did, blob)
if (takedown) {
body = {
subject: {
$type: 'com.atproto.admin.defs#repoBlobRef',
did: did,
cid: blob,
},
takedown,
}
const res = await ctx.dataplane.getBlobTakedown({
did,
cid: blob,
})
body = {
subject: {
$type: 'com.atproto.admin.defs#repoBlobRef',
did: did,
cid: blob,
},
takedown: {
applied: res.takenDown,
ref: res.takedownRef ? 'TAKEDOWN' : undefined,
},
}
} else if (uri) {
const [takedown, cidRes] = await Promise.all([
modService.getRecordTakedownRef(uri),
ctx.db
.getPrimary()
.db.selectFrom('record')
.where('uri', '=', uri)
.select('cid')
.executeTakeFirst(),
])
if (cidRes && takedown) {
const res = await ctx.hydrator.getRecord(uri, true)
if (res) {
body = {
subject: {
$type: 'com.atproto.repo.strongRef',
uri,
cid: cidRes.cid,
cid: res.cid,
},
takedown: {
applied: !!res.takedownRef,
ref: res.takedownRef || undefined,
},
takedown,
}
}
} else if (did) {
const takedown = await modService.getRepoTakedownRef(did)
if (takedown) {
const res = (await ctx.hydrator.actor.getActors([did], true)).get(did)
if (res) {
body = {
subject: {
$type: 'com.atproto.admin.defs#repoRef',
did: did,
},
takedown,
takedown: {
applied: !!res.takedownRef,
ref: res.takedownRef || undefined,
},
}
}
} else {

@ -1,4 +1,5 @@
import { AtUri } from '@atproto/syntax'
import { Timestamp } from '@bufbuild/protobuf'
import { AuthRequiredError, InvalidRequestError } from '@atproto/xrpc-server'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import {
@ -6,8 +7,6 @@ import {
isRepoBlobRef,
} from '../../../../lexicon/types/com/atproto/admin/defs'
import { isMain as isStrongRef } from '../../../../lexicon/types/com/atproto/repo/strongRef'
import { AuthRequiredError, InvalidRequestError } from '@atproto/xrpc-server'
import { CID } from 'multiformats/cid'
export default function (server: Server, ctx: AppContext) {
server.com.atproto.admin.updateSubjectStatus({
@ -19,43 +18,49 @@ export default function (server: Server, ctx: AppContext) {
'Must be a full moderator to update subject state',
)
}
const modService = ctx.services.moderation(ctx.db.getPrimary())
const now = new Date()
const { subject, takedown } = input.body
if (takedown) {
if (isRepoRef(subject)) {
const did = subject.did
if (takedown.applied) {
await modService.takedownRepo({
takedownRef: takedown.ref ?? new Date().toISOString(),
did,
await ctx.dataplane.takedownActor({
did: subject.did,
ref: takedown.ref,
seen: Timestamp.fromDate(now),
})
} else {
await modService.reverseTakedownRepo({ did })
await ctx.dataplane.untakedownActor({
did: subject.did,
seen: Timestamp.fromDate(now),
})
}
} else if (isStrongRef(subject)) {
const uri = new AtUri(subject.uri)
const cid = CID.parse(subject.cid)
if (takedown.applied) {
await modService.takedownRecord({
takedownRef: takedown.ref ?? new Date().toISOString(),
uri,
cid,
await ctx.dataplane.takedownRecord({
recordUri: subject.uri,
ref: takedown.ref,
seen: Timestamp.fromDate(now),
})
} else {
await modService.reverseTakedownRecord({ uri })
await ctx.dataplane.untakedownRecord({
recordUri: subject.uri,
seen: Timestamp.fromDate(now),
})
}
} else if (isRepoBlobRef(subject)) {
const { did, cid } = subject
if (takedown.applied) {
await modService.takedownBlob({
takedownRef: takedown.ref ?? new Date().toISOString(),
did,
cid,
await ctx.dataplane.takedownBlob({
did: subject.did,
cid: subject.cid,
ref: takedown.ref,
seen: Timestamp.fromDate(now),
})
} else {
await modService.reverseTakedownBlob({ did, cid })
await ctx.dataplane.untakedownBlob({
did: subject.did,
cid: subject.cid,
seen: Timestamp.fromDate(now),
})
}
} else {
throw new InvalidRequestError('Invalid subject')

@ -7,12 +7,9 @@ export default function (server: Server, ctx: AppContext) {
server.com.atproto.identity.resolveHandle(async ({ req, params }) => {
const handle = ident.normalizeHandle(params.handle || req.hostname)
const db = ctx.db.getReplica()
let did: string | undefined
const user = await ctx.services.actor(db).getActor(handle, true)
if (user) {
did = user.did
} else {
let [did] = await ctx.hydrator.actor.getDids([handle])
if (!did) {
const publicHostname = ctx.cfg.publicUrl
? new URL(ctx.cfg.publicUrl).hostname
: null

@ -2,37 +2,28 @@ import { InvalidRequestError } from '@atproto/xrpc-server'
import { AtUri } from '@atproto/syntax'
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { jsonStringToLex } from '@atproto/lexicon'
export default function (server: Server, ctx: AppContext) {
server.com.atproto.repo.getRecord(async ({ params }) => {
const { repo, collection, rkey, cid } = params
const db = ctx.db.getReplica()
const did = await ctx.services.actor(db).getActorDid(repo)
const [did] = await ctx.hydrator.actor.getDids([repo])
if (!did) {
throw new InvalidRequestError(`Could not find repo: ${repo}`)
}
const uri = AtUri.make(did, collection, rkey)
const uri = AtUri.make(did, collection, rkey).toString()
const result = await ctx.hydrator.getRecord(uri, true)
let builder = db.db
.selectFrom('record')
.selectAll()
.where('uri', '=', uri.toString())
if (cid) {
builder = builder.where('cid', '=', cid)
}
const record = await builder.executeTakeFirst()
if (!record) {
if (!result || (cid && result.cid !== cid)) {
throw new InvalidRequestError(`Could not locate record: ${uri}`)
}
return {
encoding: 'application/json',
encoding: 'application/json' as const,
body: {
uri: record.uri,
cid: record.cid,
value: jsonStringToLex(record.json) as Record<string, unknown>,
uri: uri,
cid: result.cid,
value: result.record,
},
}
})
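
The `ctx.hydrator.getRecord(uri, includeTakedowns)` helper used here is the same one the admin routes above rely on. A hedged sketch of the call and the fields consumed in this diff, assuming an `AppContext` `ctx` as above and an illustrative uri:

```ts
import { AtUri } from '@atproto/syntax'

const uri = AtUri.make('did:example:alice', 'app.bsky.feed.post', '3kabc123').toString()
const result = await ctx.hydrator.getRecord(uri, true) // second arg assumed to include taken-down records
if (result) {
  result.cid // cid of the indexed record
  result.record // parsed record value
  result.takedownRef // present when the record is taken down (see getSubjectStatus above)
}
```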

@ -1,30 +1,9 @@
import { Server } from '../../../../lexicon'
import AppContext from '../../../../context'
import { InvalidRequestError } from '@atproto/xrpc-server'
export default function (server: Server, ctx: AppContext) {
server.com.atproto.temp.fetchLabels(async ({ params }) => {
const { limit } = params
const db = ctx.db.getReplica()
const since =
params.since !== undefined ? new Date(params.since).toISOString() : ''
const labelRes = await db.db
.selectFrom('label')
.selectAll()
.orderBy('label.cts', 'asc')
.where('cts', '>', since)
.limit(limit)
.execute()
const labels = labelRes.map((l) => ({
...l,
cid: l.cid === '' ? undefined : l.cid,
}))
return {
encoding: 'application/json',
body: {
labels,
},
}
export default function (server: Server, _ctx: AppContext) {
server.com.atproto.temp.fetchLabels(async (_reqCtx) => {
throw new InvalidRequestError('not implemented on dataplane')
})
}

@ -1,5 +1,4 @@
import express from 'express'
import { sql } from 'kysely'
import AppContext from '../context'
export const createRouter = (ctx: AppContext): express.Router => {
@ -21,9 +20,8 @@ export const createRouter = (ctx: AppContext): express.Router => {
router.get('/xrpc/_health', async function (req, res) {
const { version } = ctx.cfg
const db = ctx.db.getPrimary()
try {
await sql`select 1`.execute(db.db)
await ctx.dataplane.ping({})
} catch (err) {
req.log.error(err, 'failed health check')
return res.status(503).send({ version, error: 'Service Unavailable' })
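
With this change the health endpoint reflects dataplane reachability rather than a postgres `select 1`. A quick check against a locally running appview (default port per the config changes below):

```ts
const res = await fetch('http://localhost:2584/xrpc/_health')
// 200 with { version } when ctx.dataplane.ping({}) succeeds; 503 Service Unavailable otherwise
console.log(res.status, await res.json())
```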

@ -5,3 +5,8 @@ export const setRepoRev = (res: express.Response, rev: string | null) => {
res.setHeader('Atproto-Repo-Rev', rev)
}
}
export const clearlyBadCursor = (cursor?: string) => {
// a '::' separator is the hallmark of a v1 cursor; it is highly unlikely in v2 cursors, which are based on time or rkeys
return !!cursor?.includes('::')
}
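
The check is purely syntactic; cursor values below are illustrative:

```ts
clearlyBadCursor('1698765432000::bafycid') // => true  (v1-style compound cursor)
clearlyBadCursor('3kabc2xyz') // => false (v2 time/rkey-based cursor)
clearlyBadCursor(undefined) // => false
```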

@ -2,9 +2,16 @@ import {
AuthRequiredError,
verifyJwt as verifyServiceJwt,
} from '@atproto/xrpc-server'
import { IdResolver } from '@atproto/identity'
import * as ui8 from 'uint8arrays'
import express from 'express'
import {
Code,
DataPlaneClient,
getKeyAsDidKey,
isDataplaneError,
unpackIdentityKeys,
} from './data-plane'
import { GetIdentityByDidResponse } from './proto/bsky_pb'
type ReqCtx = {
req: express.Request
@ -35,8 +42,6 @@ type RoleOutput = {
credentials: {
type: 'role'
admin: boolean
moderator: boolean
triage: boolean
}
}
@ -51,29 +56,35 @@ type AdminServiceOutput = {
export type AuthVerifierOpts = {
ownDid: string
adminDid: string
adminPass: string
moderatorPass: string
triagePass: string
adminPasses: string[]
}
export class AuthVerifier {
private _adminPass: string
private _moderatorPass: string
private _triagePass: string
public ownDid: string
public adminDid: string
private adminPasses: Set<string>
constructor(public idResolver: IdResolver, opts: AuthVerifierOpts) {
this._adminPass = opts.adminPass
this._moderatorPass = opts.moderatorPass
this._triagePass = opts.triagePass
constructor(public dataplane: DataPlaneClient, opts: AuthVerifierOpts) {
this.ownDid = opts.ownDid
this.adminDid = opts.adminDid
this.adminPasses = new Set(opts.adminPasses)
}
// verifiers (arrow fns to preserve scope)
standard = async (ctx: ReqCtx): Promise<StandardOutput> => {
// @TODO remove! basic auth + did supported just for testing.
if (isBasicToken(ctx.req)) {
const aud = this.ownDid
const iss = ctx.req.headers['appview-as-did']
if (typeof iss !== 'string' || !iss.startsWith('did:')) {
throw new AuthRequiredError('bad issuer')
}
if (!this.parseRoleCreds(ctx.req).admin) {
throw new AuthRequiredError('bad credentials')
}
return { credentials: { type: 'standard', iss, aud } }
}
const { iss, aud } = await this.verifyServiceJwt(ctx, {
aud: this.ownDid,
iss: null,
@ -84,7 +95,7 @@ export class AuthVerifier {
standardOptional = async (
ctx: ReqCtx,
): Promise<StandardOutput | NullOutput> => {
if (isBearerToken(ctx.req)) {
if (isBearerToken(ctx.req) || isBasicToken(ctx.req)) {
return this.standard(ctx)
}
return this.nullCreds()
@ -173,16 +184,10 @@ export class AuthVerifier {
return { status: Missing, admin: false, moderator: false, triage: false }
}
const { username, password } = parsed
if (username === 'admin' && password === this._adminPass) {
return { status: Valid, admin: true, moderator: true, triage: true }
if (username === 'admin' && this.adminPasses.has(password)) {
return { status: Valid, admin: true }
}
if (username === 'admin' && password === this._moderatorPass) {
return { status: Valid, admin: false, moderator: true, triage: true }
}
if (username === 'admin' && password === this._triagePass) {
return { status: Valid, admin: false, moderator: false, triage: true }
}
return { status: Invalid, admin: false, moderator: false, triage: false }
return { status: Invalid, admin: false }
}
async verifyServiceJwt(
@ -191,12 +196,26 @@ export class AuthVerifier {
) {
const getSigningKey = async (
did: string,
forceRefresh: boolean,
_forceRefresh: boolean, // @TODO consider propagating to dataplane
): Promise<string> => {
if (opts.iss !== null && !opts.iss.includes(did)) {
throw new AuthRequiredError('Untrusted issuer', 'UntrustedIss')
}
return this.idResolver.did.resolveAtprotoKey(did, forceRefresh)
let identity: GetIdentityByDidResponse
try {
identity = await this.dataplane.getIdentityByDid({ did })
} catch (err) {
if (isDataplaneError(err, Code.NotFound)) {
throw new AuthRequiredError('identity unknown')
}
throw err
}
const keys = unpackIdentityKeys(identity.keys)
const didKey = getKeyAsDidKey(keys, { id: 'atproto' })
if (!didKey) {
throw new AuthRequiredError('missing or bad key')
}
return didKey
}
const jwtStr = bearerTokenFromReq(reqCtx.req)
@ -222,10 +241,10 @@ export class AuthVerifier {
const viewer =
creds.credentials.type === 'standard' ? creds.credentials.iss : null
const canViewTakedowns =
(creds.credentials.type === 'role' && creds.credentials.triage) ||
(creds.credentials.type === 'role' && creds.credentials.admin) ||
creds.credentials.type === 'admin_service'
const canPerformTakedown =
(creds.credentials.type === 'role' && creds.credentials.moderator) ||
(creds.credentials.type === 'role' && creds.credentials.admin) ||
creds.credentials.type === 'admin_service'
return {
viewer,
@ -245,6 +264,10 @@ const isBearerToken = (req: express.Request): boolean => {
return req.headers.authorization?.startsWith(BEARER) ?? false
}
const isBasicToken = (req: express.Request): boolean => {
return req.headers.authorization?.startsWith(BASIC) ?? false
}
const bearerTokenFromReq = (req: express.Request) => {
const header = req.headers.authorization || ''
if (!header.startsWith(BEARER)) return null
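
Per the `@TODO` in `standard` above, basic auth plus an `appview-as-did` header can impersonate a viewer during testing. A hedged sketch of such a request (host and password illustrative):

```ts
const res = await fetch(
  'http://localhost:2584/xrpc/app.bsky.actor.getProfile?actor=alice.test',
  {
    headers: {
      // password must match one of BSKY_ADMIN_PASSWORDS for parseRoleCreds() to mark the caller admin
      authorization: 'Basic ' + Buffer.from('admin:admin-pass').toString('base64'),
      // the issuer the appview should treat as the viewer
      'appview-as-did': 'did:example:alice',
    },
  },
)
```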

@ -1,187 +0,0 @@
import axios from 'axios'
import FormData from 'form-data'
import { CID } from 'multiformats/cid'
import { IdResolver } from '@atproto/identity'
import { PrimaryDatabase } from '../db'
import { retryHttp } from '../util/retry'
import { resolveBlob } from '../api/blob-resolver'
import { labelerLogger as log } from '../logger'
const HIVE_ENDPOINT = 'https://api.thehive.ai/api/v2/task/sync'
export interface ImgLabeler {
labelImg(did: string, cid: CID): Promise<string[]>
}
export class HiveLabeler implements ImgLabeler {
constructor(
public hiveApiKey: string,
protected ctx: {
db: PrimaryDatabase
idResolver: IdResolver
},
) {}
async labelImg(did: string, cid: CID): Promise<string[]> {
const hiveRes = await retryHttp(async () => {
try {
return await this.makeHiveReq(did, cid)
} catch (err) {
log.warn({ err, did, cid: cid.toString() }, 'hive request failed')
throw err
}
})
log.info({ hiveRes, did, cid: cid.toString() }, 'hive response')
const classes = respToClasses(hiveRes)
return summarizeLabels(classes)
}
async makeHiveReq(did: string, cid: CID): Promise<HiveResp> {
const { stream } = await resolveBlob(
did,
cid,
this.ctx.db,
this.ctx.idResolver,
)
const form = new FormData()
form.append('media', stream)
const { data } = await axios.post(HIVE_ENDPOINT, form, {
headers: {
'Content-Type': 'multipart/form-data',
authorization: `token ${this.hiveApiKey}`,
accept: 'application/json',
},
})
return data
}
}
export const respToClasses = (res: HiveResp): HiveRespClass[] => {
const classes: HiveRespClass[] = []
for (const status of res.status) {
for (const out of status.response.output) {
for (const cls of out.classes) {
classes.push(cls)
}
}
}
return classes
}
// Matches only one (or none) of: porn, sexual, nudity
//
// porn: sexual activity and nudity, including explicit activity, full-frontal nudity, and suggestive intent
// sexual: sexually suggestive, not explicit; may include some forms of nudity
// nudity: non-sexual nudity (eg, artistic, possibly some photographic)
//
// hive docs/definitions: https://docs.thehive.ai/docs/sexual-content
export const sexualLabels = (classes: HiveRespClass[]): string[] => {
const scores = {}
for (const cls of classes) {
scores[cls.class] = cls.score
}
// first check if porn...
for (const pornClass of [
'yes_sexual_activity',
'animal_genitalia_and_human',
'yes_realistic_nsfw',
]) {
if (scores[pornClass] >= 0.9) {
return ['porn']
}
}
if (scores['general_nsfw'] >= 0.9) {
// special case for some anime examples
if (scores['animated_animal_genitalia'] >= 0.5) {
return ['porn']
}
// special case for some pornographic/explicit classic drawings
if (scores['yes_undressed'] >= 0.9 && scores['yes_sexual_activity'] > 0.9) {
return ['porn']
}
}
// then check for sexual suggestive (which may include nudity)...
for (const sexualClass of ['yes_sexual_intent', 'yes_sex_toy']) {
if (scores[sexualClass] >= 0.9) {
return ['sexual']
}
}
if (scores['yes_undressed'] >= 0.9) {
// special case for bondage examples
if (scores['yes_sex_toy'] > 0.75) {
return ['sexual']
}
}
// then non-sexual nudity...
for (const nudityClass of [
'yes_male_nudity',
'yes_female_nudity',
'yes_undressed',
]) {
if (scores[nudityClass] >= 0.9) {
return ['nudity']
}
}
// then finally flag remaining "underwear" images as sexually suggestive
// (after non-sexual content already labeled above)
for (const nudityClass of ['yes_male_underwear', 'yes_female_underwear']) {
if (scores[nudityClass] >= 0.9) {
// TODO: retaining the 'underwear' label for a short time to help understand
// the impact of labeling all "underwear" as "sexual". This *will* pull
// some non-sexual content into the "sexual" label.
return ['sexual']
}
}
return []
}
// gore and violence: https://docs.thehive.ai/docs/class-descriptions-violence-gore
const labelForClass = {
very_bloody: 'gore',
human_corpse: 'corpse',
hanging: 'corpse',
}
const labelForClassLessSensitive = {
yes_self_harm: 'self-harm',
}
export const summarizeLabels = (classes: HiveRespClass[]): string[] => {
const labels: string[] = sexualLabels(classes)
for (const cls of classes) {
if (labelForClass[cls.class] && cls.score >= 0.9) {
labels.push(labelForClass[cls.class])
}
}
for (const cls of classes) {
if (labelForClassLessSensitive[cls.class] && cls.score >= 0.96) {
labels.push(labelForClassLessSensitive[cls.class])
}
}
return labels
}
type HiveResp = {
status: HiveRespStatus[]
}
type HiveRespStatus = {
response: {
output: HiveRespOutput[]
}
}
type HiveRespOutput = {
time: number
classes: HiveRespClass[]
}
type HiveRespClass = {
class: string
score: number
}

@ -1,91 +0,0 @@
import { AtUri } from '@atproto/syntax'
import { AtpAgent } from '@atproto/api'
import { dedupe, getFieldsFromRecord } from './util'
import { labelerLogger as log } from '../logger'
import { PrimaryDatabase } from '../db'
import { IdResolver } from '@atproto/identity'
import { BackgroundQueue } from '../background'
import { IndexerConfig } from '../indexer/config'
import { buildBasicAuth } from '../auth-verifier'
import { CID } from 'multiformats/cid'
import { HiveLabeler, ImgLabeler } from './hive'
import { KeywordLabeler, TextLabeler } from './keyword'
import { ids } from '../lexicon/lexicons'
export class AutoModerator {
public pushAgent: AtpAgent
public imgLabeler?: ImgLabeler
public textLabeler?: TextLabeler
constructor(
public ctx: {
db: PrimaryDatabase
idResolver: IdResolver
cfg: IndexerConfig
backgroundQueue: BackgroundQueue
},
) {
const { hiveApiKey } = ctx.cfg
this.imgLabeler = hiveApiKey ? new HiveLabeler(hiveApiKey, ctx) : undefined
this.textLabeler = new KeywordLabeler(ctx.cfg.labelerKeywords)
const url = new URL(ctx.cfg.moderationPushUrl)
this.pushAgent = new AtpAgent({ service: url.origin })
this.pushAgent.api.setHeader(
'authorization',
buildBasicAuth(url.username, url.password),
)
}
processRecord(uri: AtUri, cid: CID, obj: unknown) {
this.ctx.backgroundQueue.add(async () => {
const { text, imgs } = getFieldsFromRecord(obj, uri)
await this.labelRecord(uri, cid, text, imgs).catch((err) => {
log.error(
{ err, uri: uri.toString(), record: obj },
'failed to label record',
)
})
})
}
processHandle(_handle: string, _did: string) {
// no-op since this functionality moved to auto-mod service
}
async labelRecord(uri: AtUri, recordCid: CID, text: string[], imgs: CID[]) {
if (uri.collection !== ids.AppBskyFeedPost) {
// @TODO label profiles
return
}
const allLabels = await Promise.all([
this.textLabeler?.labelText(text.join(' ')),
...imgs.map((cid) => this.imgLabeler?.labelImg(uri.host, cid)),
])
const labels = dedupe(allLabels.flat())
await this.pushLabels(uri, recordCid, labels)
}
async pushLabels(uri: AtUri, cid: CID, labels: string[]): Promise<void> {
if (labels.length < 1) return
await this.pushAgent.com.atproto.admin.emitModerationEvent({
event: {
$type: 'com.atproto.admin.defs#modEventLabel',
comment: '[AutoModerator]: Applying labels',
createLabelVals: labels,
negateLabelVals: [],
},
subject: {
$type: 'com.atproto.repo.strongRef',
uri: uri.toString(),
cid: cid.toString(),
},
createdBy: this.ctx.cfg.serverDid,
})
}
async processAll() {
await this.ctx.backgroundQueue.processAll()
}
}

@ -1,25 +0,0 @@
export interface TextLabeler {
labelText(text: string): Promise<string[]>
}
export class KeywordLabeler implements TextLabeler {
constructor(public keywords: Record<string, string>) {}
async labelText(text: string): Promise<string[]> {
return keywordLabeling(this.keywords, text)
}
}
export const keywordLabeling = (
keywords: Record<string, string>,
text: string,
): string[] => {
const lowerText = text.toLowerCase()
const labels: string[] = []
for (const word of Object.keys(keywords)) {
if (lowerText.includes(word)) {
labels.push(keywords[word])
}
}
return labels
}

@ -1,138 +0,0 @@
import { CID } from 'multiformats/cid'
import { AtUri } from '@atproto/syntax'
import * as lex from '../lexicon/lexicons'
import {
isRecord as isPost,
Record as PostRecord,
} from '../lexicon/types/app/bsky/feed/post'
import {
isRecord as isProfile,
Record as ProfileRecord,
} from '../lexicon/types/app/bsky/actor/profile'
import {
isRecord as isList,
Record as ListRecord,
} from '../lexicon/types/app/bsky/graph/list'
import {
isRecord as isGenerator,
Record as GeneratorRecord,
} from '../lexicon/types/app/bsky/feed/generator'
import { isMain as isEmbedImage } from '../lexicon/types/app/bsky/embed/images'
import { isMain as isEmbedExternal } from '../lexicon/types/app/bsky/embed/external'
import { isMain as isEmbedRecordWithMedia } from '../lexicon/types/app/bsky/embed/recordWithMedia'
type RecordFields = {
text: string[]
imgs: CID[]
}
export const getFieldsFromRecord = (
record: unknown,
uri: AtUri,
): RecordFields => {
if (isPost(record)) {
return getFieldsFromPost(record)
} else if (isProfile(record)) {
return getFieldsFromProfile(record)
} else if (isList(record)) {
return getFieldsFromList(record)
} else if (isGenerator(record)) {
return getFieldsFromGenerator(record, uri)
} else {
return { text: [], imgs: [] }
}
}
export const getFieldsFromPost = (record: PostRecord): RecordFields => {
const text: string[] = []
const imgs: CID[] = []
text.push(record.text)
const embeds = separateEmbeds(record.embed)
for (const embed of embeds) {
if (isEmbedImage(embed)) {
for (const img of embed.images) {
imgs.push(img.image.ref)
text.push(img.alt)
}
} else if (isEmbedExternal(embed)) {
if (embed.external.thumb) {
imgs.push(embed.external.thumb.ref)
}
text.push(embed.external.title)
text.push(embed.external.description)
}
}
return { text, imgs }
}
export const getFieldsFromProfile = (record: ProfileRecord): RecordFields => {
const text: string[] = []
const imgs: CID[] = []
if (record.displayName) {
text.push(record.displayName)
}
if (record.description) {
text.push(record.description)
}
if (record.avatar) {
imgs.push(record.avatar.ref)
}
if (record.banner) {
imgs.push(record.banner.ref)
}
return { text, imgs }
}
export const getFieldsFromList = (record: ListRecord): RecordFields => {
const text: string[] = []
const imgs: CID[] = []
if (record.name) {
text.push(record.name)
}
if (record.description) {
text.push(record.description)
}
if (record.avatar) {
imgs.push(record.avatar.ref)
}
return { text, imgs }
}
export const getFieldsFromGenerator = (
record: GeneratorRecord,
uri: AtUri,
): RecordFields => {
const text: string[] = []
const imgs: CID[] = []
text.push(uri.rkey)
if (record.displayName) {
text.push(record.displayName)
}
if (record.description) {
text.push(record.description)
}
if (record.avatar) {
imgs.push(record.avatar.ref)
}
return { text, imgs }
}
export const dedupe = (strs: (string | undefined)[]): string[] => {
const set = new Set<string>()
for (const str of strs) {
if (str !== undefined) {
set.add(str)
}
}
return [...set]
}
const separateEmbeds = (embed: PostRecord['embed']) => {
if (!embed) {
return []
}
if (isEmbedRecordWithMedia(embed)) {
return [{ $type: lex.ids.AppBskyEmbedRecord, ...embed.record }, embed.media]
}
return [embed]
}

@ -1,53 +1,35 @@
import assert from 'assert'
import {
DAY,
HOUR,
MINUTE,
SECOND,
parseIntWithFallback,
} from '@atproto/common'
import assert from 'node:assert'
export interface ServerConfigValues {
version: string
// service
version?: string
debugMode?: boolean
port?: number
publicUrl?: string
serverDid: string
feedGenDid?: string
dbPrimaryPostgresUrl: string
dbReplicaPostgresUrls?: string[]
dbReplicaTags?: Record<string, number[]> // E.g. { timeline: [0], thread: [1] }
dbPostgresSchema?: string
redisHost?: string // either set redis host, or both sentinel name and hosts
redisSentinelName?: string
redisSentinelHosts?: string[]
redisPassword?: string
didPlcUrl: string
didCacheStaleTTL: number
didCacheMaxTTL: number
labelCacheStaleTTL: number
labelCacheMaxTTL: number
handleResolveNameservers?: string[]
imgUriEndpoint?: string
blobCacheLocation?: string
searchEndpoint?: string
bsyncUrl?: string
// external services
dataplaneUrls: string[]
dataplaneHttpVersion?: '1.1' | '2'
dataplaneIgnoreBadTls?: boolean
bsyncUrl: string
bsyncApiKey?: string
bsyncHttpVersion?: '1.1' | '2'
bsyncIgnoreBadTls?: boolean
bsyncOnlyMutes?: boolean
courierUrl?: string
courierUrl: string
courierApiKey?: string
courierHttpVersion?: '1.1' | '2'
courierIgnoreBadTls?: boolean
courierOnlyRegistration?: boolean
adminPassword: string
moderatorPassword: string
triagePassword: string
searchUrl?: string
cdnUrl?: string
// identity
didPlcUrl: string
handleResolveNameservers?: string[]
// moderation and administration
modServiceDid: string
rateLimitsEnabled: boolean
rateLimitBypassKey?: string
rateLimitBypassIps?: string[]
adminPasswords: string[]
labelsFromIssuerDids?: string[]
// misc/dev
blobCacheLocation?: string
}
export class ServerConfig {
@ -55,149 +37,77 @@ export class ServerConfig {
constructor(private cfg: ServerConfigValues) {}
static readEnv(overrides?: Partial<ServerConfigValues>) {
const version = process.env.BSKY_VERSION || '0.0.0'
const version = process.env.BSKY_VERSION || undefined
const debugMode = process.env.NODE_ENV !== 'production'
const publicUrl = process.env.PUBLIC_URL || undefined
const serverDid = process.env.SERVER_DID || 'did:example:test'
const feedGenDid = process.env.FEED_GEN_DID
const envPort = parseInt(process.env.PORT || '', 10)
const publicUrl = process.env.BSKY_PUBLIC_URL || undefined
const serverDid = process.env.BSKY_SERVER_DID || 'did:example:test'
const envPort = parseInt(process.env.BSKY_PORT || '', 10)
const port = isNaN(envPort) ? 2584 : envPort
const redisHost =
overrides?.redisHost || process.env.REDIS_HOST || undefined
const redisSentinelName =
overrides?.redisSentinelName ||
process.env.REDIS_SENTINEL_NAME ||
undefined
const redisSentinelHosts =
overrides?.redisSentinelHosts ||
(process.env.REDIS_SENTINEL_HOSTS
? process.env.REDIS_SENTINEL_HOSTS.split(',')
: [])
const redisPassword =
overrides?.redisPassword || process.env.REDIS_PASSWORD || undefined
const didPlcUrl = process.env.DID_PLC_URL || 'http://localhost:2582'
const didCacheStaleTTL = parseIntWithFallback(
process.env.DID_CACHE_STALE_TTL,
HOUR,
)
const didCacheMaxTTL = parseIntWithFallback(
process.env.DID_CACHE_MAX_TTL,
DAY,
)
const labelCacheStaleTTL = parseIntWithFallback(
process.env.LABEL_CACHE_STALE_TTL,
30 * SECOND,
)
const labelCacheMaxTTL = parseIntWithFallback(
process.env.LABEL_CACHE_MAX_TTL,
MINUTE,
)
const handleResolveNameservers = process.env.HANDLE_RESOLVE_NAMESERVERS
? process.env.HANDLE_RESOLVE_NAMESERVERS.split(',')
const didPlcUrl = process.env.BSKY_DID_PLC_URL || 'http://localhost:2582'
const handleResolveNameservers = process.env.BSKY_HANDLE_RESOLVE_NAMESERVERS
? process.env.BSKY_HANDLE_RESOLVE_NAMESERVERS.split(',')
: []
const cdnUrl = process.env.BSKY_CDN_URL || process.env.BSKY_IMG_URI_ENDPOINT
const blobCacheLocation = process.env.BSKY_BLOB_CACHE_LOC
const searchUrl =
process.env.BSKY_SEARCH_URL ||
process.env.BSKY_SEARCH_ENDPOINT ||
undefined
let dataplaneUrls = overrides?.dataplaneUrls
dataplaneUrls ??= process.env.BSKY_DATAPLANE_URLS
? process.env.BSKY_DATAPLANE_URLS.split(',')
: []
const dataplaneHttpVersion = process.env.BSKY_DATAPLANE_HTTP_VERSION || '2'
const dataplaneIgnoreBadTls =
process.env.BSKY_DATAPLANE_IGNORE_BAD_TLS === 'true'
const labelsFromIssuerDids = process.env.BSKY_LABELS_FROM_ISSUER_DIDS
? process.env.BSKY_LABELS_FROM_ISSUER_DIDS.split(',')
: []
const imgUriEndpoint = process.env.IMG_URI_ENDPOINT
const blobCacheLocation = process.env.BLOB_CACHE_LOC
const searchEndpoint = process.env.SEARCH_ENDPOINT
const bsyncUrl = process.env.BSKY_BSYNC_URL || undefined
assert(bsyncUrl)
const bsyncApiKey = process.env.BSKY_BSYNC_API_KEY || undefined
const bsyncHttpVersion = process.env.BSKY_BSYNC_HTTP_VERSION || '2'
const bsyncIgnoreBadTls = process.env.BSKY_BSYNC_IGNORE_BAD_TLS === 'true'
const bsyncOnlyMutes = process.env.BSKY_BSYNC_ONLY_MUTES === 'true'
assert(!bsyncOnlyMutes || bsyncUrl, 'bsync-only mutes requires a bsync url')
assert(bsyncHttpVersion === '1.1' || bsyncHttpVersion === '2')
const courierUrl = process.env.BSKY_COURIER_URL || undefined
assert(courierUrl)
const courierApiKey = process.env.BSKY_COURIER_API_KEY || undefined
const courierHttpVersion = process.env.BSKY_COURIER_HTTP_VERSION || '2'
const courierIgnoreBadTls =
process.env.BSKY_COURIER_IGNORE_BAD_TLS === 'true'
const courierOnlyRegistration =
process.env.BSKY_COURIER_ONLY_REGISTRATION === 'true'
assert(
!courierOnlyRegistration || courierUrl,
'courier-only registration requires a courier url',
)
assert(courierHttpVersion === '1.1' || courierHttpVersion === '2')
const dbPrimaryPostgresUrl =
overrides?.dbPrimaryPostgresUrl || process.env.DB_PRIMARY_POSTGRES_URL
let dbReplicaPostgresUrls = overrides?.dbReplicaPostgresUrls
if (!dbReplicaPostgresUrls && process.env.DB_REPLICA_POSTGRES_URLS) {
dbReplicaPostgresUrls = process.env.DB_REPLICA_POSTGRES_URLS.split(',')
}
const dbReplicaTags = overrides?.dbReplicaTags ?? {
'*': getTagIdxs(process.env.DB_REPLICA_TAGS_ANY), // e.g. DB_REPLICA_TAGS_ANY=0,1
timeline: getTagIdxs(process.env.DB_REPLICA_TAGS_TIMELINE),
feed: getTagIdxs(process.env.DB_REPLICA_TAGS_FEED),
search: getTagIdxs(process.env.DB_REPLICA_TAGS_SEARCH),
thread: getTagIdxs(process.env.DB_REPLICA_TAGS_THREAD),
}
assert(
Object.values(dbReplicaTags)
.flat()
.every((idx) => idx < (dbReplicaPostgresUrls?.length ?? 0)),
'out of range index in replica tags',
const adminPasswords = envList(
process.env.BSKY_ADMIN_PASSWORDS || process.env.BSKY_ADMIN_PASSWORD,
)
const dbPostgresSchema = process.env.DB_POSTGRES_SCHEMA
assert(dbPrimaryPostgresUrl)
const adminPassword = process.env.ADMIN_PASSWORD || undefined
assert(adminPassword)
const moderatorPassword = process.env.MODERATOR_PASSWORD || undefined
assert(moderatorPassword)
const triagePassword = process.env.TRIAGE_PASSWORD || undefined
assert(triagePassword)
const modServiceDid =
overrides?.modServiceDid ||
process.env.MODERATION_SERVICE_DID ||
undefined
const modServiceDid = process.env.MOD_SERVICE_DID
assert(modServiceDid)
const rateLimitsEnabled = process.env.RATE_LIMITS_ENABLED === 'true'
const rateLimitBypassKey = process.env.RATE_LIMIT_BYPASS_KEY
const rateLimitBypassIps = process.env.RATE_LIMIT_BYPASS_IPS
? process.env.RATE_LIMIT_BYPASS_IPS.split(',').map((ipOrCidr) =>
ipOrCidr.split('/')[0]?.trim(),
)
: undefined
assert(dataplaneUrls.length)
assert(dataplaneHttpVersion === '1.1' || dataplaneHttpVersion === '2')
return new ServerConfig({
version,
debugMode,
port,
publicUrl,
serverDid,
feedGenDid,
dbPrimaryPostgresUrl,
dbReplicaPostgresUrls,
dbReplicaTags,
dbPostgresSchema,
redisHost,
redisSentinelName,
redisSentinelHosts,
redisPassword,
dataplaneUrls,
dataplaneHttpVersion,
dataplaneIgnoreBadTls,
searchUrl,
didPlcUrl,
didCacheStaleTTL,
didCacheMaxTTL,
labelCacheStaleTTL,
labelCacheMaxTTL,
labelsFromIssuerDids,
handleResolveNameservers,
imgUriEndpoint,
cdnUrl,
blobCacheLocation,
searchEndpoint,
bsyncUrl,
bsyncApiKey,
bsyncHttpVersion,
bsyncIgnoreBadTls,
bsyncOnlyMutes,
courierUrl,
courierApiKey,
courierHttpVersion,
courierIgnoreBadTls,
courierOnlyRegistration,
adminPassword,
moderatorPassword,
triagePassword,
adminPasswords,
modServiceDid,
rateLimitsEnabled,
rateLimitBypassKey,
rateLimitBypassIps,
...stripUndefineds(overrides ?? {}),
})
}
@ -235,76 +145,16 @@ export class ServerConfig {
return this.cfg.serverDid
}
get feedGenDid() {
return this.cfg.feedGenDid
get dataplaneUrls() {
return this.cfg.dataplaneUrls
}
get dbPrimaryPostgresUrl() {
return this.cfg.dbPrimaryPostgresUrl
get dataplaneHttpVersion() {
return this.cfg.dataplaneHttpVersion
}
get dbReplicaPostgresUrl() {
return this.cfg.dbReplicaPostgresUrls
}
get dbReplicaTags() {
return this.cfg.dbReplicaTags
}
get dbPostgresSchema() {
return this.cfg.dbPostgresSchema
}
get redisHost() {
return this.cfg.redisHost
}
get redisSentinelName() {
return this.cfg.redisSentinelName
}
get redisSentinelHosts() {
return this.cfg.redisSentinelHosts
}
get redisPassword() {
return this.cfg.redisPassword
}
get didCacheStaleTTL() {
return this.cfg.didCacheStaleTTL
}
get didCacheMaxTTL() {
return this.cfg.didCacheMaxTTL
}
get labelCacheStaleTTL() {
return this.cfg.labelCacheStaleTTL
}
get labelCacheMaxTTL() {
return this.cfg.labelCacheMaxTTL
}
get handleResolveNameservers() {
return this.cfg.handleResolveNameservers
}
get didPlcUrl() {
return this.cfg.didPlcUrl
}
get imgUriEndpoint() {
return this.cfg.imgUriEndpoint
}
get blobCacheLocation() {
return this.cfg.blobCacheLocation
}
get searchEndpoint() {
return this.cfg.searchEndpoint
get dataplaneIgnoreBadTls() {
return this.cfg.dataplaneIgnoreBadTls
}
get bsyncUrl() {
@ -315,10 +165,6 @@ export class ServerConfig {
return this.cfg.bsyncApiKey
}
get bsyncOnlyMutes() {
return this.cfg.bsyncOnlyMutes
}
get bsyncHttpVersion() {
return this.cfg.bsyncHttpVersion
}
@ -343,41 +189,37 @@ export class ServerConfig {
return this.cfg.courierIgnoreBadTls
}
get courierOnlyRegistration() {
return this.cfg.courierOnlyRegistration
get searchUrl() {
return this.cfg.searchUrl
}
get adminPassword() {
return this.cfg.adminPassword
get cdnUrl() {
return this.cfg.cdnUrl
}
get moderatorPassword() {
return this.cfg.moderatorPassword
get didPlcUrl() {
return this.cfg.didPlcUrl
}
get triagePassword() {
return this.cfg.triagePassword
get handleResolveNameservers() {
return this.cfg.handleResolveNameservers
}
get adminPasswords() {
return this.cfg.adminPasswords
}
get modServiceDid() {
return this.cfg.modServiceDid
}
get rateLimitsEnabled() {
return this.cfg.rateLimitsEnabled
get labelsFromIssuerDids() {
return this.cfg.labelsFromIssuerDids ?? []
}
get rateLimitBypassKey() {
return this.cfg.rateLimitBypassKey
get blobCacheLocation() {
return this.cfg.blobCacheLocation
}
get rateLimitBypassIps() {
return this.cfg.rateLimitBypassIps
}
}
function getTagIdxs(str?: string): number[] {
return str ? str.split(',').map((item) => parseInt(item, 10)) : []
}
function stripUndefineds(
@ -391,3 +233,8 @@ function stripUndefineds(
})
return result
}
function envList(str: string | undefined): string[] {
if (str === undefined || str.length === 0) return []
return str.split(',')
}
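
A minimal environment for the v2 config, assuming the asserts above are the only hard requirements (values illustrative): dataplane, bsync, courier, and mod-service settings are now mandatory, while the postgres and redis variables are gone.

```ts
process.env.BSKY_DATAPLANE_URLS = 'https://dataplane1.example,https://dataplane2.example'
process.env.BSKY_BSYNC_URL = 'https://bsync.example'
process.env.BSKY_COURIER_URL = 'https://courier.example'
process.env.BSKY_ADMIN_PASSWORDS = 'admin-pass-1,admin-pass-2'
process.env.MOD_SERVICE_DID = 'did:example:mod'

const cfg = ServerConfig.readEnv()
cfg.dataplaneUrls // => ['https://dataplane1.example', 'https://dataplane2.example']
cfg.adminPasswords // => ['admin-pass-1', 'admin-pass-2']
```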

@ -1,15 +1,12 @@
import * as plc from '@did-plc/lib'
import { IdResolver } from '@atproto/identity'
import { AtpAgent } from '@atproto/api'
import AtpAgent from '@atproto/api'
import { Keypair } from '@atproto/crypto'
import { createServiceJwt } from '@atproto/xrpc-server'
import { DatabaseCoordinator } from './db'
import { ServerConfig } from './config'
import { ImageUriBuilder } from './image/uri'
import { Services } from './services'
import DidRedisCache from './did-cache'
import { BackgroundQueue } from './background'
import { Redis } from './redis'
import { DataPlaneClient } from './data-plane/client'
import { Hydrator } from './hydration/hydrator'
import { Views } from './views'
import { AuthVerifier } from './auth-verifier'
import { BsyncClient } from './bsync'
import { CourierClient } from './courier'
@ -17,36 +14,37 @@ import { CourierClient } from './courier'
export class AppContext {
constructor(
private opts: {
db: DatabaseCoordinator
imgUriBuilder: ImageUriBuilder
cfg: ServerConfig
services: Services
dataplane: DataPlaneClient
searchAgent: AtpAgent | undefined
hydrator: Hydrator
views: Views
signingKey: Keypair
idResolver: IdResolver
didCache: DidRedisCache
redis: Redis
backgroundQueue: BackgroundQueue
searchAgent?: AtpAgent
bsyncClient?: BsyncClient
courierClient?: CourierClient
bsyncClient: BsyncClient
courierClient: CourierClient
authVerifier: AuthVerifier
},
) {}
get db(): DatabaseCoordinator {
return this.opts.db
}
get imgUriBuilder(): ImageUriBuilder {
return this.opts.imgUriBuilder
}
get cfg(): ServerConfig {
return this.opts.cfg
}
get services(): Services {
return this.opts.services
get dataplane(): DataPlaneClient {
return this.opts.dataplane
}
get searchAgent(): AtpAgent | undefined {
return this.opts.searchAgent
}
get hydrator(): Hydrator {
return this.opts.hydrator
}
get views(): Views {
return this.opts.views
}
get signingKey(): Keypair {
@ -61,23 +59,11 @@ export class AppContext {
return this.opts.idResolver
}
get didCache(): DidRedisCache {
return this.opts.didCache
}
get redis(): Redis {
return this.opts.redis
}
get searchAgent(): AtpAgent | undefined {
return this.opts.searchAgent
}
get bsyncClient(): BsyncClient | undefined {
get bsyncClient(): BsyncClient {
return this.opts.bsyncClient
}
get courierClient(): CourierClient | undefined {
get courierClient(): CourierClient {
return this.opts.courierClient
}
@ -93,10 +79,6 @@ export class AppContext {
keypair: this.signingKey,
})
}
get backgroundQueue(): BackgroundQueue {
return this.opts.backgroundQueue
}
}
export default AppContext

@ -1,60 +0,0 @@
import assert from 'assert'
export interface DaemonConfigValues {
version: string
dbPostgresUrl: string
dbPostgresSchema?: string
notificationsDaemonFromDid?: string
}
export class DaemonConfig {
constructor(private cfg: DaemonConfigValues) {}
static readEnv(overrides?: Partial<DaemonConfigValues>) {
const version = process.env.BSKY_VERSION || '0.0.0'
const dbPostgresUrl =
overrides?.dbPostgresUrl || process.env.DB_PRIMARY_POSTGRES_URL
const dbPostgresSchema =
overrides?.dbPostgresSchema || process.env.DB_POSTGRES_SCHEMA
const notificationsDaemonFromDid =
overrides?.notificationsDaemonFromDid ||
process.env.BSKY_NOTIFS_DAEMON_FROM_DID ||
undefined
assert(dbPostgresUrl)
return new DaemonConfig({
version,
dbPostgresUrl,
dbPostgresSchema,
notificationsDaemonFromDid,
...stripUndefineds(overrides ?? {}),
})
}
get version() {
return this.cfg.version
}
get dbPostgresUrl() {
return this.cfg.dbPostgresUrl
}
get dbPostgresSchema() {
return this.cfg.dbPostgresSchema
}
get notificationsDaemonFromDid() {
return this.cfg.notificationsDaemonFromDid
}
}
function stripUndefineds(
obj: Record<string, unknown>,
): Record<string, unknown> {
const result = {}
Object.entries(obj).forEach(([key, val]) => {
if (val !== undefined) {
result[key] = val
}
})
return result
}

@ -1,27 +0,0 @@
import { PrimaryDatabase } from '../db'
import { DaemonConfig } from './config'
import { Services } from './services'
export class DaemonContext {
constructor(
private opts: {
db: PrimaryDatabase
cfg: DaemonConfig
services: Services
},
) {}
get db(): PrimaryDatabase {
return this.opts.db
}
get cfg(): DaemonConfig {
return this.opts.cfg
}
get services(): Services {
return this.opts.services
}
}
export default DaemonContext

@ -1,78 +0,0 @@
import { PrimaryDatabase } from '../db'
import { dbLogger } from '../logger'
import { DaemonConfig } from './config'
import { DaemonContext } from './context'
import { createServices } from './services'
import { ImageUriBuilder } from '../image/uri'
import { NotificationsDaemon } from './notifications'
import logger from './logger'
export { DaemonConfig } from './config'
export type { DaemonConfigValues } from './config'
export class BskyDaemon {
public ctx: DaemonContext
public notifications: NotificationsDaemon
private dbStatsInterval: NodeJS.Timer
private notifStatsInterval: NodeJS.Timer
constructor(opts: {
ctx: DaemonContext
notifications: NotificationsDaemon
}) {
this.ctx = opts.ctx
this.notifications = opts.notifications
}
static create(opts: { db: PrimaryDatabase; cfg: DaemonConfig }): BskyDaemon {
const { db, cfg } = opts
const imgUriBuilder = new ImageUriBuilder('https://daemon.invalid') // will not be used by daemon
const services = createServices({
imgUriBuilder,
})
const ctx = new DaemonContext({
db,
cfg,
services,
})
const notifications = new NotificationsDaemon(ctx)
return new BskyDaemon({ ctx, notifications })
}
async start() {
const { db, cfg } = this.ctx
const pool = db.pool
this.notifications.run({
startFromDid: cfg.notificationsDaemonFromDid,
})
this.dbStatsInterval = setInterval(() => {
dbLogger.info(
{
idleCount: pool.idleCount,
totalCount: pool.totalCount,
waitingCount: pool.waitingCount,
},
'db pool stats',
)
}, 10000)
this.notifStatsInterval = setInterval(() => {
logger.info(
{
count: this.notifications.count,
lastDid: this.notifications.lastDid,
},
'notifications daemon stats',
)
}, 10000)
return this
}
async destroy(): Promise<void> {
await this.notifications.destroy()
await this.ctx.db.close()
clearInterval(this.dbStatsInterval)
clearInterval(this.notifStatsInterval)
}
}
export default BskyDaemon

@ -1,6 +0,0 @@
import { subsystemLogger } from '@atproto/common'
const logger: ReturnType<typeof subsystemLogger> =
subsystemLogger('bsky:daemon')
export default logger

@ -1,54 +0,0 @@
import { tidyNotifications } from '../services/util/notification'
import DaemonContext from './context'
import logger from './logger'
export class NotificationsDaemon {
ac = new AbortController()
running: Promise<void> | undefined
count = 0
lastDid: string | null = null
constructor(private ctx: DaemonContext) {}
run(opts?: RunOptions) {
if (this.running) return
this.count = 0
this.lastDid = null
this.ac = new AbortController()
this.running = this.tidyNotifications({
...opts,
forever: opts?.forever !== false, // run forever by default
})
.catch((err) => {
// allow this to cause an unhandled rejection, let deployment handle the crash.
logger.error({ err }, 'notifications daemon crashed')
throw err
})
.finally(() => (this.running = undefined))
}
private async tidyNotifications(opts: RunOptions) {
const actorService = this.ctx.services.actor(this.ctx.db)
for await (const { did } of actorService.all(opts)) {
if (this.ac.signal.aborted) return
try {
await tidyNotifications(this.ctx.db, did)
this.count++
this.lastDid = did
} catch (err) {
logger.warn({ err, did }, 'failed to tidy notifications for actor')
}
}
}
async destroy() {
this.ac.abort()
await this.running
}
}
type RunOptions = {
forever?: boolean
batchSize?: number
startFromDid?: string
}

@ -1,22 +0,0 @@
import { PrimaryDatabase } from '../db'
import { ActorService } from '../services/actor'
import { ImageUriBuilder } from '../image/uri'
import { GraphService } from '../services/graph'
import { LabelService } from '../services/label'
export function createServices(resources: {
imgUriBuilder: ImageUriBuilder
}): Services {
const { imgUriBuilder } = resources
const graph = GraphService.creator(imgUriBuilder)
const label = LabelService.creator(null)
return {
actor: ActorService.creator(imgUriBuilder, graph, label),
}
}
export type Services = {
actor: FromDbPrimary<ActorService>
}
type FromDbPrimary<T> = (db: PrimaryDatabase) => T

@ -0,0 +1,98 @@
import http from 'http'
import events from 'events'
import express from 'express'
import { ConnectRouter } from '@connectrpc/connect'
import { expressConnectMiddleware } from '@connectrpc/connect-express'
import { Database } from '../server/db'
import { Service } from '../../proto/bsync_connect'
import { MuteOperation_Type } from '../../proto/bsync_pb'
import assert from 'assert'
export class MockBsync {
constructor(public server: http.Server) {}
static async create(db: Database, port: number) {
const app = express()
const routes = createRoutes(db)
app.use(expressConnectMiddleware({ routes }))
const server = app.listen(port)
await events.once(server, 'listening')
return new MockBsync(server)
}
async destroy() {
return new Promise<void>((resolve, reject) => {
this.server.close((err) => {
if (err) {
reject(err)
} else {
resolve()
}
})
})
}
}
const createRoutes = (db: Database) => (router: ConnectRouter) =>
router.service(Service, {
async addMuteOperation(req) {
const { type, actorDid, subject } = req
if (type === MuteOperation_Type.ADD) {
if (subject.startsWith('did:')) {
assert(actorDid !== subject, 'cannot mute yourself') // @TODO pass message through in http error
await db.db
.insertInto('mute')
.values({
mutedByDid: actorDid,
subjectDid: subject,
createdAt: new Date().toISOString(),
})
.onConflict((oc) => oc.doNothing())
.execute()
} else {
await db.db
.insertInto('list_mute')
.values({
mutedByDid: actorDid,
listUri: subject,
createdAt: new Date().toISOString(),
})
.onConflict((oc) => oc.doNothing())
.execute()
}
} else if (type === MuteOperation_Type.REMOVE) {
if (subject.startsWith('did:')) {
await db.db
.deleteFrom('mute')
.where('mutedByDid', '=', actorDid)
.where('subjectDid', '=', subject)
.execute()
} else {
await db.db
.deleteFrom('list_mute')
.where('mutedByDid', '=', actorDid)
.where('listUri', '=', subject)
.execute()
}
} else if (type === MuteOperation_Type.CLEAR) {
await db.db
.deleteFrom('mute')
.where('mutedByDid', '=', actorDid)
.execute()
await db.db
.deleteFrom('list_mute')
.where('mutedByDid', '=', actorDid)
.execute()
}
return {}
},
async scanMuteOperations() {
throw new Error('not implemented')
},
async ping() {
return {}
},
})
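
Intended usage in dev-env/tests, as far as this diff shows (port illustrative):

```ts
const bsync = await MockBsync.create(db, 3010)
// ...point a BsyncClient at http://localhost:3010 and exercise addMuteOperation...
await bsync.destroy()
```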

@ -0,0 +1,151 @@
import assert from 'node:assert'
import { randomInt } from 'node:crypto'
import * as ui8 from 'uint8arrays'
import {
Code,
ConnectError,
PromiseClient,
createPromiseClient,
makeAnyClient,
} from '@connectrpc/connect'
import { createGrpcTransport } from '@connectrpc/connect-node'
import { getDidKeyFromMultibase } from '@atproto/identity'
import { Service } from '../proto/bsky_connect'
export type DataPlaneClient = PromiseClient<typeof Service>
type BaseClient = { lib: DataPlaneClient; url: URL }
type HttpVersion = '1.1' | '2'
const MAX_RETRIES = 3
export const createDataPlaneClient = (
baseUrls: string[],
opts: { httpVersion?: HttpVersion; rejectUnauthorized?: boolean },
) => {
const clients = baseUrls.map((baseUrl) => createBaseClient(baseUrl, opts))
assert(clients.length > 0, 'no clients available')
return makeAnyClient(Service, (method) => {
return async (...args) => {
let tries = 0
let error: unknown
let remainingClients = clients
while (tries < MAX_RETRIES) {
const client = randomElement(remainingClients)
assert(client, 'no clients available')
try {
return await client.lib[method.localName](...args)
} catch (err) {
if (err instanceof ConnectError && err.code === Code.Unavailable) {
tries++
error = err
remainingClients = getRemainingClients(remainingClients, client)
} else {
throw err
}
}
}
assert(error)
throw error
}
}) as DataPlaneClient
}
export { Code }
export const isDataplaneError = (
err: unknown,
code?: Code,
): err is ConnectError => {
if (err instanceof ConnectError) {
return !code || err.code === code
}
return false
}
const createBaseClient = (
baseUrl: string,
opts: { httpVersion?: HttpVersion; rejectUnauthorized?: boolean },
): BaseClient => {
const { httpVersion = '2', rejectUnauthorized = true } = opts
const transport = createGrpcTransport({
baseUrl,
httpVersion,
acceptCompression: [],
nodeOptions: { rejectUnauthorized },
})
return {
lib: createPromiseClient(Service, transport),
url: new URL(baseUrl),
}
}
const getRemainingClients = (clients: BaseClient[], lastClient: BaseClient) => {
if (clients.length < 2) return clients // no clients to choose from
if (lastClient.url.port) {
// if the last client had a port, we attempt to exclude its whole host.
const maybeRemaining = clients.filter(
(c) => c.url.hostname !== lastClient.url.hostname,
)
if (maybeRemaining.length) {
return maybeRemaining
}
}
return clients.filter((c) => c !== lastClient)
}
const randomElement = <T>(arr: T[]): T | undefined => {
if (arr.length === 0) return
return arr[randomInt(arr.length)]
}
export const unpackIdentityServices = (servicesBytes: Uint8Array) => {
const servicesStr = ui8.toString(servicesBytes, 'utf8')
if (!servicesStr) return {}
return JSON.parse(servicesStr) as UnpackedServices
}
export const unpackIdentityKeys = (keysBytes: Uint8Array) => {
const keysStr = ui8.toString(keysBytes, 'utf8')
if (!keysStr) return {}
return JSON.parse(keysStr) as UnpackedKeys
}
export const getServiceEndpoint = (
services: UnpackedServices,
opts: { id: string; type: string },
) => {
const endpoint =
services[opts.id] &&
services[opts.id].Type === opts.type &&
validateUrl(services[opts.id].URL)
return endpoint || undefined
}
export const getKeyAsDidKey = (keys: UnpackedKeys, opts: { id: string }) => {
const key =
keys[opts.id] &&
getDidKeyFromMultibase({
type: keys[opts.id].Type,
publicKeyMultibase: keys[opts.id].PublicKeyMultibase,
})
return key || undefined
}
type UnpackedServices = Record<string, { Type: string; URL: string }>
type UnpackedKeys = Record<string, { Type: string; PublicKeyMultibase: string }>
const validateUrl = (urlStr: string): string | undefined => {
let url
try {
url = new URL(urlStr)
} catch {
return undefined
}
if (!['http:', 'https:'].includes(url.protocol)) {
return undefined
} else if (!url.hostname) {
return undefined
} else {
return urlStr
}
}
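
Putting the client pieces together, roughly how `ctx.dataplane` is constructed from config and how callers branch on `NotFound`; the exact wiring is assumed from the routes above rather than spelled out here:

```ts
const dataplane = createDataPlaneClient(cfg.dataplaneUrls, {
  httpVersion: cfg.dataplaneHttpVersion,
  rejectUnauthorized: !cfg.dataplaneIgnoreBadTls,
})

try {
  const identity = await dataplane.getIdentityByDid({ did: 'did:example:alice' })
  const keys = unpackIdentityKeys(identity.keys)
  const didKey = getKeyAsDidKey(keys, { id: 'atproto' })
} catch (err) {
  if (isDataplaneError(err, Code.NotFound)) {
    // unknown did; Unavailable errors were already retried across hosts above
  } else {
    throw err
  }
}
```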

@ -0,0 +1,3 @@
export * from './server'
export * from './client'
export * from './bsync'

@ -1,13 +1,13 @@
import PQueue from 'p-queue'
import { PrimaryDatabase } from './db'
import { dbLogger } from './logger'
import { Database } from './db'
import { dbLogger } from '../../logger'
// A simple queue for in-process, out-of-band/backgrounded work
export class BackgroundQueue {
queue = new PQueue({ concurrency: 20 })
queue = new PQueue()
destroyed = false
constructor(public db: PrimaryDatabase) {}
constructor(public db: Database) {}
add(task: Task) {
if (this.destroyed) {
@ -32,4 +32,4 @@ export class BackgroundQueue {
}
}
type Task = (db: PrimaryDatabase) => Promise<void>
type Task = (db: Database) => Promise<void>
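
Example usage of the queue in the mock dataplane (task body hypothetical): callers enqueue out-of-band work, and tests can drain it, assuming the class keeps its `processAll()` helper as the automoderator's usage above suggests.

```ts
const bg = new BackgroundQueue(db)

bg.add(async (db) => {
  // hypothetical task: out-of-band bookkeeping against the mock dataplane db
})

// assumed helper: await all queued work, e.g. during test teardown
await bg.processAll()
```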

@ -24,6 +24,7 @@ import * as actorSync from './tables/actor-sync'
import * as record from './tables/record'
import * as notification from './tables/notification'
import * as notificationPushToken from './tables/notification-push-token'
import * as didCache from './tables/did-cache'
import * as moderation from './tables/moderation'
import * as label from './tables/label'
import * as algo from './tables/algo'
@ -58,6 +59,7 @@ export type DatabaseSchemaType = duplicateRecord.PartialDB &
record.PartialDB &
notification.PartialDB &
notificationPushToken.PartialDB &
didCache.PartialDB &
moderation.PartialDB &
label.PartialDB &
algo.PartialDB &

@@ -1,35 +1,77 @@
 import assert from 'assert'
 import EventEmitter from 'events'
 import {
-  Migrator,
+  Kysely,
   KyselyPlugin,
+  Migrator,
   PluginTransformQueryArgs,
   PluginTransformResultArgs,
-  RootOperationNode,
+  PostgresDialect,
   QueryResult,
+  RootOperationNode,
   UnknownRow,
-  sql,
 } from 'kysely'
-import { Pool as PgPool } from 'pg'
 import TypedEmitter from 'typed-emitter'
-import { wait } from '@atproto/common'
-import DatabaseSchema from './database-schema'
+import { Pool as PgPool, types as pgTypes } from 'pg'
 import * as migrations from './migrations'
-import { CtxMigrationProvider } from './migrations/provider'
-import { dbLogger as log } from '../logger'
+import DatabaseSchema, { DatabaseSchemaType } from './database-schema'
 import { PgOptions } from './types'
-import { Database } from './db'
+import { dbLogger } from '../../../logger'
+import { CtxMigrationProvider } from './migrations/provider'
-export class PrimaryDatabase extends Database {
+export class Database {
+  pool: PgPool
+  db: DatabaseSchema
   migrator: Migrator
   txEvt = new EventEmitter() as TxnEmitter
   destroyed = false
-  isPrimary = true
   constructor(
     public opts: PgOptions,
-    instances?: { db: DatabaseSchema; pool: PgPool },
+    instances?: { db: DatabaseSchema; pool: PgPool; migrator: Migrator },
   ) {
-    super(opts, instances)
+    // if instances are provided, use those
+    if (instances) {
+      this.db = instances.db
+      this.pool = instances.pool
+      this.migrator = instances.migrator
+      return
+    }
+    // else create a pool & connect
+    const { schema, url } = opts
+    const pool =
+      opts.pool ??
+      new PgPool({
+        connectionString: url,
+        max: opts.poolSize,
+        maxUses: opts.poolMaxUses,
+        idleTimeoutMillis: opts.poolIdleTimeoutMs,
+      })
+    // Select count(*) and other pg bigints as js integer
+    pgTypes.setTypeParser(pgTypes.builtins.INT8, (n) => parseInt(n, 10))
+    // Setup schema usage, primarily for test parallelism (each test suite runs in its own pg schema)
+    if (schema && !/^[a-z_]+$/i.test(schema)) {
+      throw new Error(`Postgres schema must only contain [A-Za-z_]: ${schema}`)
+    }
+    pool.on('error', onPoolError)
+    pool.on('connect', (client) => {
+      client.on('error', onClientError)
+      // Used for trigram indexes, e.g. on actor search
+      client.query('SET pg_trgm.word_similarity_threshold TO .4;')
+      if (schema) {
+        // Shared objects such as extensions will go in the public schema
+        client.query(`SET search_path TO "${schema}",public;`)
+      }
+    })
+    this.pool = pool
+    this.db = new Kysely<DatabaseSchemaType>({
+      dialect: new PostgresDialect({ pool }),
+    })
     this.migrator = new Migrator({
       db: this.db,
       migrationTableSchema: opts.schema,
@@ -37,23 +79,20 @@ export class PrimaryDatabase extends Database {
     })
   }
-  static is(db: Database): db is PrimaryDatabase {
-    return db.isPrimary
+  get schema(): string | undefined {
+    return this.opts.schema
   }
-  asPrimary(): PrimaryDatabase {
-    return this
-  }
-  async transaction<T>(fn: (db: PrimaryDatabase) => Promise<T>): Promise<T> {
+  async transaction<T>(fn: (db: Database) => Promise<T>): Promise<T> {
     const leakyTxPlugin = new LeakyTxPlugin()
     const { dbTxn, txRes } = await this.db
       .withPlugin(leakyTxPlugin)
       .transaction()
       .execute(async (txn) => {
-        const dbTxn = new PrimaryDatabase(this.opts, {
+        const dbTxn = new Database(this.opts, {
           db: txn,
           pool: this.pool,
+          migrator: this.migrator,
         })
         const txRes = await fn(dbTxn)
           .catch(async (err) => {
@@ -69,17 +108,23 @@ export class PrimaryDatabase extends Database {
     return txRes
   }
+  get isTransaction() {
+    return this.db.isTransaction
+  }
+  assertTransaction() {
+    assert(this.isTransaction, 'Transaction required')
+  }
+  assertNotTransaction() {
+    assert(!this.isTransaction, 'Cannot be in a transaction')
+  }
   onCommit(fn: () => void) {
     this.assertTransaction()
     this.txEvt.once('commit', fn)
   }
-  async close(): Promise<void> {
-    if (this.destroyed) return
-    await this.db.destroy()
-    this.destroyed = true
-  }
   async migrateToOrThrow(migration: string) {
     if (this.schema) {
       await this.db.schema.createSchema(this.schema).ifNotExists().execute()
@@ -108,48 +153,17 @@ export class PrimaryDatabase extends Database {
     return results
   }
-  async maintainMaterializedViews(opts: {
-    views: string[]
-    intervalSec: number
-    signal: AbortSignal
-  }) {
-    const { views, intervalSec, signal } = opts
-    while (!signal.aborted) {
-      // super basic synchronization by agreeing when the intervals land relative to unix timestamp
-      const now = Date.now()
-      const intervalMs = 1000 * intervalSec
-      const nextIteration = Math.ceil(now / intervalMs)
-      const nextInMs = nextIteration * intervalMs - now
-      await wait(nextInMs)
-      if (signal.aborted) break
-      await Promise.all(
-        views.map(async (view) => {
-          try {
-            await this.refreshMaterializedView(view)
-            log.info(
-              { view, time: new Date().toISOString() },
-              'materialized view refreshed',
-            )
-          } catch (err) {
-            log.error(
-              { view, err, time: new Date().toISOString() },
-              'materialized view refresh failed',
-            )
-          }
-        }),
-      )
-    }
-  }
-  async refreshMaterializedView(view: string) {
-    const { ref } = this.db.dynamic
-    await sql`refresh materialized view concurrently ${ref(view)}`.execute(
-      this.db,
-    )
+  async close(): Promise<void> {
+    if (this.destroyed) return
+    await this.db.destroy()
+    this.destroyed = true
   }
 }
-export default PrimaryDatabase
+export default Database
+const onPoolError = (err: Error) => dbLogger.error({ err }, 'db pool error')
+const onClientError = (err: Error) => dbLogger.error({ err }, 'db client error')
 // utils
 // -------
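A usage sketch of the consolidated `Database` class, assuming only the `url` option read in the constructor above (the connection string is illustrative). The transaction callback receives a transactional `Database`, and `onCommit` defers side effects until the transaction actually commits:

```ts
import { sql } from 'kysely'

const db = new Database({ url: 'postgresql://localhost/bsky_test' })

await db.transaction(async (dbTxn) => {
  dbTxn.assertTransaction()
  dbTxn.onCommit(() => {
    // fires only after the transaction commits successfully
    console.log('transaction committed')
  })
  await sql`select 1`.execute(dbTxn.db)
})

await db.close()
```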

@@ -0,0 +1 @@
+export * from './db'
