stats/lib/platform-stats-fetchers.js (25 additions & 4 deletions)
@@ -3,6 +3,8 @@ import { today, yesterday } from './request-helpers.js'

/** @typedef {import('@filecoin-station/spark-stats-db').Queryable} Queryable */

const ONE_DAY = 24 * 60 * 60 * 1000

/**
* @param {Queryable} pgPool
* @param {import('./typings.js').DateRangeFilter} filter
@@ -67,14 +69,33 @@ export const fetchParticipantsWithTopMeasurements = async (pgPool, filter) => {
* @param {import('./typings.js').DateRangeFilter} filter
*/
export const fetchDailyRewardTransfers = async (pgPool, filter) => {
  assert(
    new Date(filter.to).getTime() - new Date(filter.from).getTime() <= 31 * ONE_DAY,
    400,
    'Date range must be 31 days max'
  )
  const { rows } = await pgPool.query(`
    SELECT day::TEXT, SUM(amount) as amount
    SELECT day::TEXT, to_address, amount
Member:

Would it be possible to create some kind of iterator, using something like pg-cursor, so that we could avoid loading all rows into memory in order to do the calculation?
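Purely as an illustration of that idea (not code from this PR): a minimal sketch of a cursor-based variant, assuming `pgPool` is a `pg.Pool` (so `connect()`/`release()` are available), with a made-up helper name and batch size.

```js
import Cursor from 'pg-cursor'

// Hypothetical streaming variant: reads rows in batches and folds them into
// per-day aggregates instead of materialising the whole result set at once.
const fetchDailyRewardTransfersStreaming = async (pgPool, filter) => {
  const client = await pgPool.connect()
  try {
    const cursor = client.query(new Cursor(`
      SELECT day::TEXT, to_address, amount
      FROM daily_reward_transfers
      WHERE day >= $1 AND day <= $2
      ORDER BY day
    `, [filter.from, filter.to]))

    const days = {}
    while (true) {
      // An empty batch signals that the cursor is exhausted.
      const rows = await cursor.read(1000)
      if (rows.length === 0) break
      for (const row of rows) {
        days[row.day] ??= { day: row.day, amount: '0', transfers: [] }
        const day = days[row.day]
        day.amount = String(BigInt(day.amount) + BigInt(row.amount))
        day.transfers.push({ toAddress: row.to_address, amount: row.amount })
      }
    }
    await cursor.close()
    return Object.values(days)
  } finally {
    client.release()
  }
}
```

Since the endpoint returns every individual transfer, the aggregated `days` object still grows with the row count; a cursor mainly avoids buffering the raw result set. Limiting the date range, as this PR ends up doing, addresses the growth more directly.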

Member:

Good point.

I ran this query against the live database, and it returned fewer than 2,000 rows. I think that's because we only have data from 2024-12-02 and 2024-12-09. When I extrapolate that to the entire year (52 weeks) and account for a doubled network size, we are talking about ~110k rows. Loading that into memory at once seems doable to me. WDYT?

EXPLAIN ANALYZE
SELECT day::TEXT, to_address, amount 
FROM daily_reward_transfers 
WHERE day >= '2024-12-01' AND day <= '2024-12-31';

                                                                        QUERY PLAN                                                                         
-----------------------------------------------------------------------------------------------------------------------------------------------------------
 Index Scan using daily_reward_transfers_day on daily_reward_transfers  (cost=0.29..72.24 rows=1862 width=87) (actual time=0.032..0.586 rows=2189 loops=1)
   Index Cond: ((day >= '2024-12-01'::date) AND (day <= '2024-12-31'::date))
 Planning Time: 0.106 ms
 Execution Time: 0.720 ms

Having written that, I have a different concern:

The execution time for two weeks' worth of data is 0.720 ms. That would be something like 18 seconds to query the entire year. I think we need to impose a limit on the filter interval this API endpoint allows.
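As a rough cross-check of the ~110k figure (an editor's back-of-the-envelope, not part of the original thread): the plan above reports 2,189 rows for two payout days, i.e. roughly 1,100 rows per week, and 1,100 × 52 weeks × 2 for a doubled network ≈ 114k rows.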

Member:

@bajtos Indeed, limiting the range would be more optimal.

    FROM daily_reward_transfers
    WHERE day >= $1 AND day <= $2
    GROUP BY day
    ORDER BY day
  `, [filter.from, filter.to])
  return rows
  const days = {}
  for (const row of rows) {
    if (!days[row.day]) {
      days[row.day] = {
        day: row.day,
        amount: '0',
        transfers: []
      }
    }
    const day = days[row.day]
    day.amount = String(BigInt(day.amount) + BigInt(row.amount))
    day.transfers.push({
      toAddress: row.to_address,
      amount: row.amount
    })
  }
  return Object.values(days)
}

/**
stats/test/platform-routes.test.js (35 additions & 2 deletions)
@@ -243,10 +243,43 @@ describe('Platform Routes HTTP request handler', () => {
      await assertResponseStatus(res, 200)
      const metrics = await res.json()
      assert.deepStrictEqual(metrics, [
        { day: '2024-01-11', amount: '150' },
        { day: '2024-01-12', amount: '550' }
        {
          day: '2024-01-11',
          amount: '150',
          transfers: [
            {
              toAddress: 'to2',
              amount: '150'
            }
          ]
        },
        {
          day: '2024-01-12',
          amount: '550',
          transfers: [
            {
              toAddress: 'to2',
              amount: '300'
            },
            {
              toAddress: 'to3',
              amount: '250'
            }
          ]
        }
      ])
    })
    it('returns 400 if the date range is more than 31 days', async () => {
      const res = await fetch(
        new URL(
          '/transfers/daily?from=2024-01-01&to=2024-02-02',
          baseUrl
        ), {
          redirect: 'manual'
        }
      )
      await assertResponseStatus(res, 400)
    })
  })

  describe('GET /participants/top-earning', () => {