Leo/less pagination #3387

Merged
merged 3 commits on Jun 22, 2025
4 changes: 3 additions & 1 deletion lib/MetaCPAN/Web/Role/Request.pm
@@ -26,7 +26,9 @@ sub get_page_size {
     my $default_page_size = shift;

     my $page_size = $req->param('size');
-    unless ( is_PositiveInt($page_size) && $page_size <= 500 ) {
+
+    # We no longer support more than 100 results per page
+    unless ( is_PositiveInt($page_size) && $page_size <= 100 ) {
         $page_size = $default_page_size;
     }
     return $page_size;
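The hunk above tightens the page-size cap from 500 to 100: anything non-numeric, non-positive, or above the cap falls back to the default. A minimal sketch of the same clamping logic, in Python for illustration (the names `get_page_size` and `MAX_PAGE_SIZE` here are stand-ins for the Perl sub, not part of the MetaCPAN codebase):

```python
MAX_PAGE_SIZE = 100  # mirrors the new cap in the patched Perl sub


def get_page_size(raw_size, default_page_size):
    """Clamp a user-supplied ?size= parameter: non-numeric,
    non-positive, or over-the-cap values fall back to the default."""
    try:
        size = int(raw_size)
    except (TypeError, ValueError):
        return default_page_size
    if size < 1 or size > MAX_PAGE_SIZE:
        return default_page_size
    return size
```

Under this sketch, `get_page_size("500", 20)` now returns the default `20`, where the pre-patch cap of 500 would have accepted it.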
14 changes: 3 additions & 11 deletions root/inc/pager.tx
@@ -3,31 +3,23 @@
 <div class="text-center">
 <ul class="pagination">
 <li class="[% if !$pageset.previous_page { 'disabled' } %]">
-<a href="[% $page_url({ p => $pageset.previous_page }) %]">«</a>
+<a href="[% $page_url({ p => $pageset.previous_page, size => $pageset.entries_per_page }) %]">«</a>
 </li>
 
 %% for $pageset.pages_in_set -> $page_num {
 <li [% if $page_num == $pageset.current_page { %] class="active"[% } %]>
-<a href="[% $page_url({ p => $page_num }) %]">[% $page_num %]</a>
+<a href="[% $page_url({ p => $page_num, size => $pageset.entries_per_page }) %]">[% $page_num %]</a>
 </li>
 %% }
 
 <li class="[% if !$pageset.next_page { 'disabled' } %]">
-<a href="[% $page_url({ p => $pageset.next_page }) %]">»</a>
+<a href="[% $page_url({ p => $pageset.next_page, size => $pageset.entries_per_page }) %]">»</a>
 </li>
 </ul>
 </div>
 %% }
 
 <div class="text-center">
-<ul class="pagination">
-<li class="disabled"><a>Results per page:</a></li>
-%% for [10, 20, 50, 100, 200, 500] -> $page_size {
-<li [% if $page_size == $pageset.entries_per_page { %] class="active"[% } %]>
-<a href="[% $page_url({ p => $pageset.current_page, size => $page_size}) %]">[% $page_size %]</a>
-</li>
-%% }
-</ul>
 <div class="smaller">
 [% $pageset.total_entries | format_number %]
 [% pluralize("result", $pageset.total_entries) %]
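The template change removes the "Results per page" selector and instead threads the current `entries_per_page` value into every pager link, so moving between pages no longer drops a non-default page size. A rough Python sketch of that URL-building behaviour (the `page_url` helper here is a hypothetical stand-in for the template's `$page_url`, not the actual Xslate helper):

```python
from urllib.parse import urlencode


def page_url(base, page, size):
    # Stand-in for the template's $page_url helper: after this
    # change every pager link carries both the page number and
    # the current page size, so the size survives navigation.
    return f"{base}?{urlencode({'p': page, 'size': size})}"
```

For example, `page_url("/search", 2, 100)` yields `/search?p=2&size=100`, whereas the old links carried only `p` and silently reset the size.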
67 changes: 1 addition & 66 deletions root/robots.txt
@@ -14,69 +14,4 @@ Disallow: /*?*size=*
 Sitemap: https://metacpan.org/sitemap-authors.xml.gz
 Sitemap: https://metacpan.org/sitemap-releases.xml.gz
 
-# Stop the bots, using list from:
-# https://github.com/ai-robots-txt/ai.robots.txt/blob/main/robots.txt
-User-agent: AI2Bot
-User-agent: Ai2Bot-Dolma
-User-agent: aiHitBot
-User-agent: Amazonbot
-User-agent: anthropic-ai
-User-agent: Applebot
-User-agent: Applebot-Extended
-User-agent: Brightbot 1.0
-User-agent: Bytespider
-User-agent: CCBot
-User-agent: ChatGPT-User
-User-agent: Claude-SearchBot
-User-agent: Claude-User
-User-agent: Claude-Web
-User-agent: ClaudeBot
-User-agent: cohere-ai
-User-agent: cohere-training-data-crawler
-User-agent: Cotoyogi
-User-agent: Crawlspace
-User-agent: Diffbot
-User-agent: DuckAssistBot
-User-agent: FacebookBot
-User-agent: Factset_spyderbot
-User-agent: FirecrawlAgent
-User-agent: FriendlyCrawler
-User-agent: Google-CloudVertexBot
-User-agent: Google-Extended
-User-agent: GoogleOther
-User-agent: GoogleOther-Image
-User-agent: GoogleOther-Video
-User-agent: GPTBot
-User-agent: iaskspider/2.0
-User-agent: ICC-Crawler
-User-agent: ImagesiftBot
-User-agent: img2dataset
-User-agent: imgproxy
-User-agent: ISSCyberRiskCrawler
-User-agent: Kangaroo Bot
-User-agent: meta-externalagent
-User-agent: Meta-ExternalAgent
-User-agent: meta-externalfetcher
-User-agent: Meta-ExternalFetcher
-User-agent: MistralAI-User/1.0
-User-agent: NovaAct
-User-agent: OAI-SearchBot
-User-agent: omgili
-User-agent: omgilibot
-User-agent: Operator
-User-agent: PanguBot
-User-agent: Perplexity-User
-User-agent: PerplexityBot
-User-agent: PetalBot
-User-agent: QualifiedBot
-User-agent: Scrapy
-User-agent: SemrushBot-OCOB
-User-agent: SemrushBot-SWA
-User-agent: Sidetrade indexer bot
-User-agent: TikTokSpider
-User-agent: Timpibot
-User-agent: VelenPublicWebCrawler
-User-agent: Webzio-Extended
-User-agent: wpbot
-User-agent: YouBot
-Disallow: /
+# Stop the bots - using signalsciences to block
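The block removed from robots.txt relied on a grouping rule: consecutive `User-agent` lines form one group and share the rules that follow them, which is how dozens of crawlers were covered by a single `Disallow: /`. A simplified parser sketch of that grouping semantics (an illustrative toy, not how any real crawler or the site is implemented):

```python
def parse_groups(robots_txt):
    """Split a robots.txt body into (agents, rules) groups.
    Consecutive User-agent lines accumulate into one group; the
    Allow/Disallow lines after them apply to every listed agent."""
    groups, agents, rules = [], [], []
    collecting_agents = True
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments
        if not line:
            continue
        key, _, value = line.partition(":")
        key, value = key.strip().lower(), value.strip()
        if key == "user-agent":
            if not collecting_agents:  # rules ended: new group starts
                groups.append((agents, rules))
                agents, rules = [], []
            agents.append(value)
            collecting_agents = True
        elif key in ("disallow", "allow"):
            rules.append((key, value))
            collecting_agents = False
    if agents:
        groups.append((agents, rules))
    return groups
```

Fed the removed block, this would return one group pairing the entire agent list with the single `("disallow", "/")` rule, which is why deleting the block re-admits all of those crawlers at once (blocking now happens at the signalsciences layer instead).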