Add individual daily transfers. Closes #253 (#268)
Changes from all commits
@@ -3,6 +3,8 @@ import { today, yesterday } from './request-helpers.js'

 /** @typedef {import('@filecoin-station/spark-stats-db').Queryable} Queryable */

+const ONE_DAY = 24 * 60 * 60 * 1000
+
 /**
  * @param {Queryable} pgPool
  * @param {import('./typings.js').DateRangeFilter} filter
@@ -67,14 +69,33 @@ export const fetchParticipantsWithTopMeasurements = async (pgPool, filter) => {
  * @param {import('./typings.js').DateRangeFilter} filter
  */
 export const fetchDailyRewardTransfers = async (pgPool, filter) => {
+  assert(
+    new Date(filter.to).getTime() - new Date(filter.from).getTime() <= 31 * ONE_DAY,
+    400,
+    'Date range must be 31 days max'
+  )
   const { rows } = await pgPool.query(`
-    SELECT day::TEXT, SUM(amount) as amount
+    SELECT day::TEXT, to_address, amount
Review comment: Would it be possible to create some kind of iterator, using something like pg-cursor, so that we could avoid loading all rows into memory in order to do the calculation?

Reply from bajtos: Good point. I ran this query against the live database, and it returned fewer than 2,000 rows. I think that's because we have data from 2024-12-02 and 2024-12-09 only. When I extrapolate that to the entire year (52 weeks) and account for double the network size, we are talking about ~110k rows. Loading that into memory at once seems doable to me. WDYT?

Having written that, I have a different concern: the execution time for two weeks' worth of data is 0.720s. That would be roughly 18 seconds to query the entire year. I think we need to impose some limits on the date range.

Review comment: @bajtos Indeed, limiting the range would be the better approach.
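To make the iterator idea concrete, here is a minimal sketch of how the same query could be streamed in batches with pg-cursor instead of materializing every row at once. This is not how the PR resolved the discussion; it assumes `pgPool` is a `pg.Pool` exposing `connect()`, and the batch size of 1,000 and the `aggregateRow` callback are illustrative only:

```js
import Cursor from 'pg-cursor'

// Stream daily_reward_transfers in batches so that only one batch
// is held in memory at a time (sketch only, not part of this PR).
const streamDailyRewardTransfers = async (pgPool, filter, aggregateRow) => {
  const client = await pgPool.connect()
  try {
    const cursor = client.query(new Cursor(`
      SELECT day::TEXT, to_address, amount
      FROM daily_reward_transfers
      WHERE day >= $1 AND day <= $2
      ORDER BY day
    `, [filter.from, filter.to]))
    try {
      let rows
      // cursor.read(n) resolves with up to n rows; an empty array means we are done
      while ((rows = await cursor.read(1000)).length > 0) {
        for (const row of rows) aggregateRow(row)
      }
    } finally {
      await cursor.close()
    }
  } finally {
    client.release()
  }
}
```

The trade-off discussed in the thread still applies: streaming bounds memory use, but it does nothing for query execution time, which is why the PR caps the date range instead.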
     FROM daily_reward_transfers
     WHERE day >= $1 AND day <= $2
-    GROUP BY day
     ORDER BY day
   `, [filter.from, filter.to])
-  return rows
+  const days = {}
+  for (const row of rows) {
+    if (!days[row.day]) {
+      days[row.day] = {
+        day: row.day,
+        amount: '0',
+        transfers: []
+      }
+    }
+    const day = days[row.day]
+    day.amount = String(BigInt(day.amount) + BigInt(row.amount))
+    day.transfers.push({
+      toAddress: row.to_address,
+      amount: row.amount
+    })
+  }
+  return Object.values(days)
 }
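The three-argument `assert(value, status, message)` call in the diff follows the http-assert convention, where a failed assertion throws an error carrying an HTTP status code; the import is outside this hunk, so the exact package is an assumption. A minimal sketch of the same guard in isolation, assuming http-assert:

```js
import assert from 'http-assert' // assumption: the repo may use a different assert helper

const ONE_DAY = 24 * 60 * 60 * 1000

// Throws an error with err.status === 400 when the range exceeds 31 days,
// which the HTTP layer can translate into a "400 Bad Request" response.
const assertDateRangeWithinLimit = (filter) => {
  assert(
    new Date(filter.to).getTime() - new Date(filter.from).getTime() <= 31 * ONE_DAY,
    400,
    'Date range must be 31 days max'
  )
}
```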
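For reference, the reworked function now returns one object per day, keeping the per-day total alongside the individual transfers. The total is accumulated with BigInt and stored as a string, presumably because token amounts can exceed Number.MAX_SAFE_INTEGER. A hypothetical result for a single day with two transfers (the addresses and amounts are made up):

```js
// Hypothetical return value of fetchDailyRewardTransfers for one day
// with two transfers; addresses and amounts are illustrative only.
const example = [
  {
    day: '2024-12-02',
    amount: '300', // stringified BigInt sum of the day's transfers
    transfers: [
      { toAddress: '0x0000000000000000000000000000000000000001', amount: '100' },
      { toAddress: '0x0000000000000000000000000000000000000002', amount: '200' }
    ]
  }
]
```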