
Commit aae7ed7

Parse the responses into a user-friendly format and update the readme (#1)
* chore: update readme
* fix: add encoding to setCookies
* feat: handle api responses and convert typescript types
* feat!: seperate types && parse responses to types
* fix: add actionResult type and remove debug lines
* chore: add statistics documentation
1 parent 9b88c85 commit aae7ed7

File tree

4 files changed (+379, -201 lines changed)


README.md

Lines changed: 95 additions & 22 deletions

## Example Usages

### [Super (Residential & Mobile)](https://scrape.do/documentation/#super-residential--mobile?utm_source=github&utm_medium=node-client)

The super parameter enables the use of a residential proxy for the request. When this parameter is set to true, the request will be routed through a residential IP address. This means that the IP address will typically appear as if it belongs to a mobile network provider, adding an additional layer of anonymity and making the request look more like regular web traffic.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  // ...
  super: true,
});

console.log(response);
```
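
For a runnable version, the call can be written out in full; the httpbin.co endpoint below is only an assumed test target, reused from the other examples in this README:

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything", // assumed test URL; any target works here
  super: true, // route the request through a residential/mobile proxy
});

console.log(response);
```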

### [Geo Targeting](https://scrape.do/documentation/#geo-targeting?utm_source=github&utm_medium=node-client)

The geoCode parameter allows you to specify the geographic location from which the request should appear to originate. By setting a specific country code, such as "us" for the United States, the request will be routed through an IP address from that region. This is especially useful for scraping websites that serve region-specific content or pricing, allowing you to access data as if you were browsing from that location.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  geoCode: "us",
});

console.log(response);
```

### [Regional Geo Targeting](https://scrape.do/documentation/#regional-geo-targeting?utm_source=github&utm_medium=node-client)

The regionalGeoCode parameter allows you to target requests from a broader geographic region, rather than a specific country. By specifying a regional code such as "europe" or "asia", your request will be routed through an IP address from that particular region. This is useful for scraping content that may be region-restricted, or for accessing region-specific data without the need to specify individual country codes.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  regionalGeoCode: "europe",
});

console.log(response);
```

### [Sticky Sessions](https://scrape.do/documentation/#sticky-sessions?utm_source=github&utm_medium=node-client)

The sessionId parameter enables you to use the same proxy address for multiple requests over a certain period. By passing a unique integer value (e.g., sessionId=1234), you can maintain a persistent session with a single proxy. This is useful when you need consistent IP continuity for scraping operations or interacting with websites that track user sessions.

Key points to note:

- Session ID Range: The sessionId must be an integer between 0 and 1,000,000.
- Session Timeout: If no request is made using the sessionId for 5 minutes, the session will automatically expire.
- Session Failure: If a request made with a session ID fails, a new proxy will be assigned, and the session will reset.
- Geo Targeting Compatibility: When used with Geo Targeting or Regional Geo Targeting, the session will be locked to the specified country or region (see the sketch after the example below).
- No session required for new proxies: If you want to use a different proxy for each request, you don’t need to set a sessionId.
- Sessions only for successful requests: A session will only be created if the initial request is successful.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  sessionId: "1234",
});

console.log(response);
```
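
As noted in the compatibility point above, a session can be locked to a country by combining sessionId with geoCode. A minimal sketch, again using the httpbin.co test URL as an assumed target:

```typescript
const client = new ScrapeDo("example_token");

// The first successful request creates the session and locks it to US proxies
const first = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  geoCode: "us",
  sessionId: "1234",
});

// Reusing the same sessionId within 5 minutes keeps the same proxy address
const second = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  geoCode: "us",
  sessionId: "1234",
});

console.log(first, second);
```

If the second request were sent more than 5 minutes later, the session would have expired and a fresh proxy would be assigned.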

### [Custom Headers](https://scrape.do/documentation/#custom-headers?utm_source=github&utm_medium=node-client)

The customHeaders option gives you full control over all headers sent to the target website. When you use customHeaders, the headers you provide will completely replace the default ones. This feature is useful when you need to define specific headers like User-Agent, Accept, Cookies, and more, ensuring that only your specified headers are sent with the request.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  // ...
});

console.log(response);
```
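
A fuller sketch of the same call follows; the header names echo those mentioned above, and their values are purely illustrative:

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  customHeaders: {
    // With customHeaders, these replace the service's default headers entirely
    "User-Agent": "my-scraper/1.0", // illustrative value
    "Accept": "application/json", // illustrative value
    "Cookie": "session=example", // illustrative value
  },
});

console.log(response);
```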

### [Extra Headers](https://scrape.do/documentation/#extra-headers?utm_source=github&utm_medium=node-client)

extraHeaders is used when you want to add one or more headers specifically required by the target website, without altering the core headers automatically generated by the service. This is useful for passing additional information while maintaining the integrity of the existing request headers.

The following example asks httpbin.co to echo the request back; you should see the ‘Key’ header in the headers section of the response.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  // ...
});

console.log(response);
```
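
Written out in full, the httpbin.co example might look like this; the 'Key' header name comes from the paragraph above, while its value is illustrative:

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  extraHeaders: {
    // Added on top of the headers the service generates automatically
    "Key": "example-value", // illustrative value
  },
});

console.log(response);
```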

### [Forward Headers](https://scrape.do/documentation/#forward-headers?utm_source=github&utm_medium=node-client)

The forwardHeaders option is ideal when you want to forward your custom headers directly to the target website without any additional headers being generated or modified by the service. This approach makes the request appear as if it is being made directly from your end, preserving the original header structure.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  // ...
});

console.log(response);
```
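
A complete sketch, with an assumed header that is forwarded to the target site unchanged:

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  forwardHeaders: {
    // Only the headers listed here are sent; the service adds nothing on top
    "Accept-Language": "en-US", // illustrative value
  },
});

console.log(response);
```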

### [JS Render](https://scrape.do/documentation/#js-render?utm_source=github&utm_medium=node-client)

The render parameter allows for the execution of JavaScript during the request, enabling full browser-like rendering. When this parameter is set to true, the service will render the target webpage as if it were being loaded in a real browser, executing all JavaScript, loading dynamic content, and handling client-side interactions. This approach is particularly useful for scraping websites that rely heavily on JavaScript to display their content, providing a more accurate and “humanized” view of the page.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  // ...
  render: true,
});

console.log(response);
```
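
Spelled out in full, and again assuming the httpbin.co test endpoint used elsewhere in this README:

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything", // assumed test URL; rendering matters most on JS-heavy pages
  render: true, // execute JavaScript in a browser before returning the page
});

console.log(response);
```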

### [Get account statistics](https://scrape.do/documentation/#usage-statistics-api?utm_source=github&utm_medium=node-client)

The statistics() method allows you to retrieve real-time usage statistics for your subscription. This API call returns details such as your current subscription status, the number of concurrent requests allowed, the total and remaining requests per month, and how many concurrent requests are still available.

Key information retrieved:

- IsActive: Indicates whether your subscription is active.
- ConcurrentRequest: The total number of concurrent requests your subscription supports.
- MaxMonthlyRequest: The maximum number of requests allowed per month.
- RemainingConcurrentRequest: The number of concurrent requests you have left at the current time.
- RemainingMonthlyRequest: The remaining number of requests you can send this month.

> [!WARNING]
> For security reasons, you can send up to 10 requests per minute to this endpoint. If you exceed this rate, you will receive a 429 Too Many Requests error.

```typescript
const client = new ScrapeDo("example_token");
const stats = await client.statistics();

console.log(stats);
```
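
The fields listed above can be read straight off the returned object. A small sketch, assuming the property names match the list:

```typescript
const client = new ScrapeDo("example_token");
const stats = await client.statistics();

// Simple pre-flight check before starting a scraping run
if (stats.IsActive && stats.RemainingMonthlyRequest > 0) {
  console.log(`Monthly requests remaining: ${stats.RemainingMonthlyRequest} of ${stats.MaxMonthlyRequest}`);
  console.log(`Concurrent slots free: ${stats.RemainingConcurrentRequest} of ${stats.ConcurrentRequest}`);
} else {
  console.log("Subscription inactive or monthly quota exhausted.");
}
```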

### Final bonus example (render, super, geoCode, playWithBrowser)

In this example, multiple parameters are combined to showcase advanced scraping capabilities. By using a combination of render, super, geoCode, and playWithBrowser, you can perform complex scraping tasks that require JavaScript execution, residential proxies, geographical targeting, and interactive browser actions:

- [render: true](https://scrape.do/documentation/#js-render?utm_source=github&utm_medium=node-client): Enables JavaScript execution to fully render the webpage, allowing for the scraping of dynamic content that relies on client-side scripting.
- [super: true](https://scrape.do/documentation/#super-residential--mobile?utm_source=github&utm_medium=node-client): Utilizes a residential proxy, which makes the request appear as if it is coming from a typical user on a mobile network, providing enhanced anonymity and avoiding blocks from anti-scraping measures.
- geoCode: "us": Routes the request through a US IP address so that region-specific content is returned (see Geo Targeting above).
- playWithBrowser: Defines interactive browser actions to perform while the page is rendered.

```typescript
const client = new ScrapeDo("example_token");
const response = await client.sendRequest("GET", {
  url: "https://example.com",
  render: true,
  super: true,
  geoCode: "us",
  playWithBrowser: [
    // ... browser actions ...
  ],
});

console.log(response);
```

## Official links

## License

This project is licensed under the MIT License. See the [LICENSE](./LICENSE) file for more details.

## Disclaimer

Any damages arising from the use of the library or service or any other legal situation cannot be associated with the scrape.do legal entity and team. The responsibility lies entirely with the user.
