
Commit 7e39f38

Some improvements to the readme file
1 parent 1da76a6 commit 7e39f38


1 file changed: +33 -10 lines changed

README.md

Lines changed: 33 additions & 10 deletions
@@ -1,7 +1,14 @@
 <p align="center">
   <img width="100" height="100" src="https://avatars.githubusercontent.com/u/67231321?s=200&v=4">
   <h3 align="center">Scrape Do Node Client</h3>
-  <p align="center">Scrape.do's official http client for node.js</p>
+  <p align="center">Get unblocked while scraping the web - we bypass anti-bots and rotate proxies while you only pay for successful requests.</p>
+
+  <p align="center">
+    <img src="https://img.shields.io/npm/v/@scrape-do/client/" />
+    <img src="https://github.com/scrape-do/node-client/actions/workflows/build-test.yml/badge.svg?branch=main" />
+    <img src="https://img.shields.io/github/issues/scrape-do/node-client" alt="Issues" />
+    <img src="https://img.shields.io/github/license/scrape-do/node-client" alt="License" />
+  </p>
 </p>
 
 ## How to install?
@@ -15,14 +22,16 @@ or install with github
 npm install git://git@github.com/scrape-do/node-client
 ```
 
-## How to build from scratch
-If you want to contribute to the library or include your own customisations, you can recompile the library in this way.
+## How Do I Import the Library?
 
-```bash
-git clone https://github.com/scrape-do/node-client
-npm i
-# build with
-npm build
+```js
+// CommonJS
+const { ScrapeDo } = require("@scrape-do/client");
+```
+
+```typescript
+// Module - TypeScript
+import { ScrapeDo } from '@scrape-do/client'
 ```
 
 ## Example Usages
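As a companion to the import hunk above, here is a minimal instantiation sketch (not part of the commit). The constructor signature — API token as the sole argument — is an assumption not confirmed by this diff, so the snippet falls back to a hypothetical stand-in class when the package is not installed locally:

```js
// Sketch, not part of the commit: instantiating the client after the
// CommonJS import shown in the diff. Falls back to a hypothetical
// stand-in class when @scrape-do/client is not installed, so the
// snippet runs on its own.
let ScrapeDo;
try {
  ({ ScrapeDo } = require("@scrape-do/client"));
} catch {
  // Stand-in mirroring the assumed constructor (token as sole argument).
  ScrapeDo = class {
    constructor(token) {
      this.token = token;
    }
  };
}

const client = new ScrapeDo("example_token");
console.log(client instanceof ScrapeDo);
```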
@@ -200,8 +209,11 @@ console.log(stats);
 
 In this example, multiple parameters are combined to showcase advanced scraping capabilities. By using a combination of render, super, geoCode, and playWithBrowser, you can perform complex scraping tasks that require JavaScript execution, residential proxies, geographical targeting, and interactive browser actions:
 
-- [render: true](https://scrape.do/documentation/#js-render?utm_source=github&utm_medium=node-client): Enables JavaScript execution to fully render the webpage, allowing for the scraping of dynamic content that relies on client-side scripting.
-- [super: true](https://scrape.do/documentation/#super-residential--mobile?utm_source=github&utm_medium=node-client): Utilizes a residential proxy, which makes the request appear as if it is coming from a typical user on a mobile network, providing enhanced anonymity and avoiding blocks from anti-scraping measures.
+> [!WARNING]
+> The browser created with this endpoint can be detected. It is best suited to simple tasks in your scraping workflow, such as waiting for the page to load or interacting with page elements.
+
+- [render](https://scrape.do/documentation/#js-render?utm_source=github&utm_medium=node-client): Enables JavaScript execution to fully render the webpage, allowing for the scraping of dynamic content that relies on client-side scripting.
+- [super](https://scrape.do/documentation/#super-residential--mobile?utm_source=github&utm_medium=node-client): Utilizes a residential proxy, which makes the request appear as if it is coming from a typical user on a mobile network, providing enhanced anonymity and avoiding blocks from anti-scraping measures.
 - [geoCode](https://scrape.do/documentation/#geo-targeting?utm_source=github&utm_medium=node-client): "us": Targets a specific geographic location for the request, in this case, the United States. This is useful for scraping content that varies by region, such as localized prices or region-specific data.
 - [playWithBrowser](https://scrape.do/documentation/#play-with-browser?utm_source=github&utm_medium=node-client): Provides the ability to interact with the browser while rendering the page. For example, you can wait for specific elements to load or perform actions like clicking buttons. In this case, it waits for the <body> element to ensure the page is fully loaded before proceeding.
 
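The parameter combination described in that hunk can be sketched as a plain request-options object. The shape of the `playWithBrowser` action (`Action`/`WaitSelector` fields) and the `url` value are assumptions based on the README's prose, not confirmed by this diff; the actual request call is shown only as a comment because it needs an API token and network access:

```js
// Sketch, not part of the commit: the combined options from the
// commit's parameter list, built as a plain object.
const options = {
  url: "https://example.com", // hypothetical target URL
  render: true,               // execute JavaScript before returning HTML
  super: true,                // route through a residential/mobile proxy
  geoCode: "us",              // geo-target the request to the United States
  playWithBrowser: [
    // Assumed action shape: wait for <body> so the page is fully loaded.
    { Action: "WaitSelector", WaitSelector: "body" },
  ],
};

// The request itself needs an API token and network access, e.g.:
//   const response = await client.sendRequest("GET", options);
console.log(Object.keys(options)); // the five combined parameters
```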
@@ -225,6 +237,17 @@ const response = await client.sendRequest("GET", {
 console.log(response);
 ```
 
+## How to build from scratch
+If you want to contribute to the library or include your own customisations, you can recompile the library in this way.
+
+```bash
+git clone https://github.com/scrape-do/node-client
+npm i
+# build with
+npm run build
+```
+
+
 ## Official links
 
 - [Scrape.do](https://scrape.do?utm_source=github&utm_medium=node-client)
