The super parameter enables the use of a residential proxy for the request. When this parameter is set to true, the request is routed through a residential IP address, which typically appears to belong to a mobile network provider. This adds an extra layer of anonymity and makes the request look more like regular web traffic.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  super: true,
});

console.log(response);
```
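Note that these examples use await at the top level. In a plain CommonJS script (as the require call implies), a minimal sketch of running the same call inside an async entry point might look like this:

```typescript
const { ScrapeDo } = require("@scrape-do/client");

// Minimal wrapper sketch: run the example above inside an async context.
async function main() {
  const client = new ScrapeDo("your_api_token");
  const response = await client.sendRequest("GET", {
    url: "https://httpbin.co/anything",
    super: true,
  });
  console.log(response);
}

main().catch(console.error);
```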
The geoCode parameter allows you to specify the geographic location from which the request should appear to originate. By setting a specific country code, such as "us" for the United States, the request will be routed through an IP address from that region. This is especially useful for scraping websites that serve region-specific content or pricing, allowing you to access data as if you were browsing from that location.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  geoCode: "us",
});

console.log(response);
```
The regionalGeoCode parameter allows you to target requests from a broader geographic region, rather than a specific country. By specifying a regional code such as "europe" or "asia", your request will be routed through an IP address from that particular region. This is useful for scraping content that may be region-restricted, or for accessing region-specific data without the need to specify individual country codes.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  regionalGeoCode: "europe",
});

console.log(response);
```

Key points to note:
- Sessions only for successful requests: A session will only be created if the initial request is successful.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  sessionId: "1234",
});

console.log(response);
```
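As a follow-up, here is a minimal sketch of reusing a session across calls, assuming (as is typical for sticky sessions) that requests sharing the same sessionId are routed through the same proxy:

```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");

// The first successful request creates session "1234".
const first = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  sessionId: "1234",
});

// Sending the same sessionId again keeps the follow-up request on that session,
// which helps when the target site expects consecutive calls to look consistent.
const second = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  sessionId: "1234",
});

console.log(first, second);
```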
The customHeaders option gives you full control over all headers sent to the target website. When you use customHeaders, the headers you provide will completely replace the default ones. This feature is useful when you need to define specific headers like User-Agent, Accept, Cookies, and more, ensuring that only your specified headers are sent with the request.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  customHeaders: {
    // Example values: with customHeaders, only the headers you list here are sent.
    "User-Agent": "CustomUserAgent/1.0",
    "Accept": "application/json",
  },
});

console.log(response);
```
extraHeaders is used when you want to add one or more headers specifically required by the target website, without replacing the default headers sent with the request.
The following example echoes your request back from httpbin.co; you should see the ‘Key’ header in the headers section of the response.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  extraHeaders: {
    // Example header from the description above; it should appear in httpbin's echo.
    "Key": "Value",
  },
});

console.log(response);
```
The forwardHeaders option is ideal when you want to forward your custom headers directly to the target website without any additional headers being generated or modified by the service. This approach makes the request appear as if it is being made directly from your end, preserving the original header structure.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  forwardHeaders: {
    // Example value: these headers are forwarded to the target site as-is.
    "Accept-Language": "en-US",
  },
});

console.log(response);
```
The render parameter allows for the execution of JavaScript during the request, enabling full browser-like rendering. When this parameter is set to true, the service will render the target webpage as if it were being loaded in a real browser, executing all JavaScript, loading dynamic content, and handling client-side interactions. This approach is particularly useful for scraping websites that rely heavily on JavaScript to display their content, providing a more accurate and “humanized” view of the page.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  render: true,
});

console.log(response);
```
You can also retrieve usage statistics for your subscription with the statistics() method.
> For security reasons, you can send up to 10 requests per minute to this endpoint. If you exceed this rate, you will receive a 429 Too Many Requests error.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const stats = await client.statistics();

console.log(stats);
```
In the example sketched after the parameter list below, multiple parameters are combined to showcase advanced scraping capabilities. By using render, super, geoCode, and playWithBrowser together, you can perform complex scraping tasks that require JavaScript execution, residential proxies, geographic targeting, and interactive browser actions:
> [!WARNING]
> The browser created with this endpoint can be detected. It can be used for simple tasks such as waiting for the page to load or interacting with the page in your scraping tasks.

- [render](https://scrape.do/documentation/#js-render?utm_source=github&utm_medium=node-client): Enables JavaScript execution to fully render the webpage, allowing for the scraping of dynamic content that relies on client-side scripting.
- [super](https://scrape.do/documentation/#super-residential--mobile?utm_source=github&utm_medium=node-client): Utilizes a residential proxy, which makes the request appear as if it is coming from a typical user on a mobile network, providing enhanced anonymity and avoiding blocks from anti-scraping measures.
- [geoCode](https://scrape.do/documentation/#geo-targeting?utm_source=github&utm_medium=node-client): `"us"`: Targets a specific geographic location for the request, in this case the United States. This is useful for scraping content that varies by region, such as localized prices or region-specific data.
- [playWithBrowser](https://scrape.do/documentation/#play-with-browser?utm_source=github&utm_medium=node-client): Provides the ability to interact with the browser while rendering the page. For example, you can wait for specific elements to load or perform actions like clicking buttons. In this case, it waits for the `<body>` element to ensure the page is fully loaded before proceeding.
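A minimal sketch of such a combined request follows; the playWithBrowser action shape (a WaitSelector step) is an assumption based on the linked documentation rather than a verbatim copy of it:

```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  render: true, // execute JavaScript in a real browser
  super: true, // route through a residential/mobile proxy
  geoCode: "us", // originate the request from a US IP
  // Assumed action shape: wait for the <body> element before the content is captured.
  playWithBrowser: [
    {
      Action: "WaitSelector",
      WaitSelector: "body",
    },
  ],
});

console.log(response);
```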
#### Any damages arising from the use of the library or service or any other legal situation cannot be associated with the scrape.do legal entity and team. The responsibility lies entirely with the user.