
Commit a211a56

Made some design changes to the readme file
1 parent df36c01 commit a211a56

File tree: 1 file changed (+46 −22 lines)


README.md

Lines changed: 46 additions & 22 deletions
````diff
@@ -1,24 +1,28 @@
-# @scrape-do/client
-
-#### Scrape.do's official http client for node.js
+<p align="center">
+  <img width="100" height="100" src="https://avatars.githubusercontent.com/u/67231321?s=200&v=4">
+  <h3 align="center">Scrape Do Node Client</h3>
+  <p align="center">Scrape.do's official http client for node.js</p>
+</p>
 
 ## How to install?
 
 ```bash
-> npm install @scrape-do/client
-# or get it from github
-> npm install git://git@github.com/scrape-do/node-client
+npm i @scrape-do/client
 ```
+or install from GitHub
 
-## How to build from scratch
+```bash
+npm install git://git@github.com/scrape-do/node-client
+```
 
-#### If you want to contribute to the library or include your own customisations, you can recompile the library in this way.
+## How to build from scratch
+If you want to contribute to the library or include your own customisations, you can recompile the library in this way.
 
 ```bash
-> git clone https://github.com/scrape-do/node-client
-> npm i
+git clone https://github.com/scrape-do/node-client
+npm i
 # build with
-> npm build
+npm run build
 ```
 
 ## Example Usages
````
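The hunks below show only the changed lines, so each example request is cut off at the hunk boundary. For reference, a complete request following the pattern these hunks introduce would look roughly like this sketch (it assumes the `ScrapeDo` export and the `sendRequest(method, options)` shape visible in the diff; `your_api_token` is a placeholder):

```typescript
// Sketch: minimal end-to-end usage of the client as shown in the README diff.
// Assumes @scrape-do/client exports a ScrapeDo class whose sendRequest
// returns a promise with the response, as the examples below indicate.
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token"); // placeholder token

async function main() {
  const response = await client.sendRequest("GET", {
    url: "https://httpbin.co/anything",
  });
  console.log(response);
}

main().catch(console.error);
```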
````diff
@@ -28,7 +32,9 @@
 The super parameter enables the use of a residential proxy for the request. When this parameter is set to true, the request will be routed through a residential IP address. This means that the IP address will typically appear as if it belongs to a mobile network provider, adding an additional layer of anonymity and making the request look more like regular web traffic.
 
 ```typescript
-const client = new ScrapeDo("example_token");
+const { ScrapeDo } = require("@scrape-do/client");
+
+const client = new ScrapeDo("your_api_token");
 const response = await client.sendRequest("GET", {
   url: "https://httpbin.co/anything",
   super: true,
````
````diff
@@ -42,7 +48,9 @@ console.log(response);
 The geoCode parameter allows you to specify the geographic location from which the request should appear to originate. By setting a specific country code, such as "us" for the United States, the request will be routed through an IP address from that region. This is especially useful for scraping websites that serve region-specific content or pricing, allowing you to access data as if you were browsing from that location.
 
 ```typescript
-const client = new ScrapeDo("example_token");
+const { ScrapeDo } = require("@scrape-do/client");
+
+const client = new ScrapeDo("your_api_token");
 const response = await client.sendRequest("GET", {
   url: "https://httpbin.co/anything",
   geoCode: "us",
````
````diff
@@ -56,7 +64,9 @@ console.log(response);
 The regionalGeoCode parameter allows you to target requests from a broader geographic region, rather than a specific country. By specifying a regional code such as "europe" or "asia", your request will be routed through an IP address from that particular region. This is useful for scraping content that may be region-restricted, or for accessing region-specific data without the need to specify individual country codes.
 
 ```typescript
-const client = new ScrapeDo("example_token");
+const { ScrapeDo } = require("@scrape-do/client");
+
+const client = new ScrapeDo("your_api_token");
 const response = await client.sendRequest("GET", {
   url: "https://httpbin.co/anything",
   regionalGeoCode: "europe",
````
````diff
@@ -79,7 +89,9 @@ Key points to note:
 - Sessions only for successful requests: A session will only be created if the initial request is successful.
 
 ```typescript
-const client = new ScrapeDo("example_token");
+const { ScrapeDo } = require("@scrape-do/client");
+
+const client = new ScrapeDo("your_api_token");
 const response = await client.sendRequest("GET", {
   url: "https://httpbin.co/anything",
   sessionId: "1234",
````
````diff
@@ -93,7 +105,9 @@ console.log(response);
 The customHeaders option gives you full control over all headers sent to the target website. When you use customHeaders, the headers you provide will completely replace the default ones. This feature is useful when you need to define specific headers like User-Agent, Accept, Cookies, and more, ensuring that only your specified headers are sent with the request.
 
 ```typescript
-const client = new ScrapeDo("example_token");
+const { ScrapeDo } = require("@scrape-do/client");
+
+const client = new ScrapeDo("your_api_token");
 const response = await client.sendRequest("GET", {
   url: "https://httpbin.co/anything",
   customHeaders: {
````
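The hunk stops at the opening brace of `customHeaders`, so here is a hedged completion of the call. The header names come from the paragraph above; the values are illustrative, and with `customHeaders` they fully replace the service's defaults:

```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token"); // placeholder token

async function main() {
  const response = await client.sendRequest("GET", {
    url: "https://httpbin.co/anything",
    // With customHeaders, ONLY the headers listed here are sent.
    customHeaders: {
      "User-Agent": "my-scraper/1.0", // illustrative values
      "Accept": "application/json",
      "Cookie": "session=abc123",
    },
  });
  console.log(response);
}

main().catch(console.error);
```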
````diff
@@ -111,7 +125,9 @@ extraHeaders is used when you want to add one or more headers specifically requi
 The following example returns httpbin.co's echo of your request; you should see the ‘Key’ header in the headers section of the response.
 
 ```typescript
-const client = new ScrapeDo("example_token");
+const { ScrapeDo } = require("@scrape-do/client");
+
+const client = new ScrapeDo("your_api_token");
 const response = await client.sendRequest("GET", {
   url: "https://httpbin.co/anything",
   extraHeaders: {
````
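A completed version of this example under the same assumptions; `"Key": "value"` is the illustrative pair the README says httpbin.co should echo back:

```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token"); // placeholder token

async function main() {
  const response = await client.sendRequest("GET", {
    url: "https://httpbin.co/anything",
    // extraHeaders adds to the default headers instead of replacing them.
    extraHeaders: {
      "Key": "value", // httpbin.co echoes this back in its response
    },
  });
  console.log(response);
}

main().catch(console.error);
```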
````diff
@@ -127,7 +143,9 @@ console.log(response);
 The forwardHeaders option is ideal when you want to forward your custom headers directly to the target website without any additional headers being generated or modified by the service. This approach makes the request appear as if it is being made directly from your end, preserving the original header structure.
 
 ```typescript
-const client = new ScrapeDo("example_token");
+const { ScrapeDo } = require("@scrape-do/client");
+
+const client = new ScrapeDo("your_api_token");
 const response = await client.sendRequest("GET", {
   url: "https://httpbin.co/anything",
   forwardHeaders: {
````
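Again the hunk ends at the opening brace, so this is a hedged completion; `X-Custom-Header` and its value are illustrative names, not part of the original diff:

```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token"); // placeholder token

async function main() {
  const response = await client.sendRequest("GET", {
    url: "https://httpbin.co/anything",
    // forwardHeaders passes these through untouched, with no
    // service-generated headers added alongside them.
    forwardHeaders: {
      "X-Custom-Header": "my-value", // illustrative header
    },
  });
  console.log(response);
}

main().catch(console.error);
```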
````diff
@@ -143,7 +161,9 @@ console.log(response);
 The render parameter allows for the execution of JavaScript during the request, enabling full browser-like rendering. When this parameter is set to true, the service will render the target webpage as if it were being loaded in a real browser, executing all JavaScript, loading dynamic content, and handling client-side interactions. This approach is particularly useful for scraping websites that rely heavily on JavaScript to display their content, providing a more accurate and “humanized” view of the page.
 
 ```typescript
-const client = new ScrapeDo("example_token");
+const { ScrapeDo } = require("@scrape-do/client");
+
+const client = new ScrapeDo("your_api_token");
 const response = await client.sendRequest("GET", {
   url: "https://httpbin.co/anything",
   render: true,
````
````diff
@@ -168,7 +188,9 @@ Key information retrieved:
 > For security reasons, you can send up to 10 requests per minute to this endpoint. If you exceed this rate, you will receive a 429 Too Many Requests error.
 
 ```typescript
-const client = new ScrapeDo("example_token");
+const { ScrapeDo } = require("@scrape-do/client");
+
+const client = new ScrapeDo("your_api_token");
 const stats = await client.statistics();
 
 console.log(stats);
````
````diff
@@ -184,7 +206,9 @@ In this example, multiple parameters are combined to showcase advanced scraping
 - [playWithBrowser](https://scrape.do/documentation/#play-with-browser?utm_source=github&utm_medium=node-client): Provides the ability to interact with the browser while rendering the page. For example, you can wait for specific elements to load or perform actions like clicking buttons. In this case, it waits for the <body> element to ensure the page is fully loaded before proceeding.
 
 ```typescript
-const client = new ScrapeDo("example_token");
+const { ScrapeDo } = require("@scrape-do/client");
+
+const client = new ScrapeDo("your_api_token");
 const response = await client.sendRequest("GET", {
   url: "https://example.com",
   render: true,
````
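The hunk cuts off before the `playWithBrowser` option appears. The sketch below assumes an action array with a wait-for-selector step, matching the bullet's description of waiting for the `<body>` element; the exact action schema is an assumption and should be checked against the linked Scrape.do documentation:

```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token"); // placeholder token

async function main() {
  const response = await client.sendRequest("GET", {
    url: "https://example.com",
    render: true, // execute JavaScript in a real browser
    // Assumed action shape: wait until <body> exists before returning.
    playWithBrowser: [{ Action: "WaitSelector", WaitSelector: "body" }],
  });
  console.log(response);
}

main().catch(console.error);
```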
````diff
@@ -215,4 +239,4 @@ console.log(response);
 
 ## Disclaimer
 
-#### Any damages arising from the use of the library or service or any other legal situation cannot be associated with the scrape.do legal entity and team. The responsibility lies entirely with the user.
+#### Any damages arising from the use of the library or service or any other legal situation cannot be associated with the scrape.do legal entity and team. The responsibility lies entirely with the user.
````
