The super parameter enables the use of a residential proxy for the request. When this parameter is set to true, the request will be routed through a residential IP address. This means that the IP address will typically appear as if it belongs to a mobile network provider, adding an additional layer of anonymity and making the request look more like regular web traffic.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  super: true,
});

console.log(response);
```
The geoCode parameter allows you to specify the geographic location from which the request should appear to originate. By setting a specific country code, such as "us" for the United States, the request will be routed through an IP address from that region. This is especially useful for scraping websites that serve region-specific content or pricing, allowing you to access data as if you were browsing from that location.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  geoCode: "us",
});

console.log(response);
```
The regionalGeoCode parameter allows you to target requests from a broader geographic region, rather than a specific country. By specifying a regional code such as "europe" or "asia", your request will be routed through an IP address from that particular region. This is useful for scraping content that may be region-restricted, or for accessing region-specific data without the need to specify individual country codes.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  regionalGeoCode: "europe",
});

console.log(response);
```
Key points to note:
- Sessions only for successful requests: A session will only be created if the initial request is successful.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  sessionId: "1234",
});

console.log(response);
```
The customHeaders option gives you full control over all headers sent to the target website. When you use customHeaders, the headers you provide will completely replace the default ones. This feature is useful when you need to define specific headers like User-Agent, Accept, Cookies, and more, ensuring that only your specified headers are sent with the request.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  customHeaders: {
    // Illustrative values: the headers you provide here replace ALL defaults
    "User-Agent": "CustomAgent/1.0",
    Accept: "application/json",
  },
});

console.log(response);
```

extraHeaders is used when you want to add one or more headers specifically required by the target site, without replacing the defaults.
The following example returns httpbin.co's echo of how your request arrived. You should see the 'Key' header in the headers section of the response.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  extraHeaders: {
    // Added on top of the default headers; this is the 'Key' header to look for
    Key: "Value",
  },
});

console.log(response);
```
The forwardHeaders option is ideal when you want to forward your custom headers directly to the target website without any additional headers being generated or modified by the service. This approach makes the request appear as if it is being made directly from your end, preserving the original header structure.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  forwardHeaders: {
    // Illustrative header: only the headers listed here are sent to the target
    "Accept-Language": "en-US",
  },
});

console.log(response);
```
The render parameter allows for the execution of JavaScript during the request, enabling full browser-like rendering. When this parameter is set to true, the service will render the target webpage as if it were being loaded in a real browser, executing all JavaScript, loading dynamic content, and handling client-side interactions. This approach is particularly useful for scraping websites that rely heavily on JavaScript to display their content, providing a more accurate and “humanized” view of the page.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://httpbin.co/anything",
  render: true,
});

console.log(response);
```
> For security reasons, you can send up to 10 requests per minute to this endpoint. If you exceed this rate, you will receive a 429 Too Many Requests error.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const stats = await client.statistics();

console.log(stats);
```
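Because this endpoint allows roughly 10 requests per minute, a small retry helper can absorb an occasional 429. The sketch below is illustrative and not part of @scrape-do/client; it assumes the failed call throws an error carrying a numeric `status` field, and the name `retryOn429` is made up for this example:

```typescript
// Hypothetical helper (not part of the client): retry an async call when it
// fails with HTTP 429, waiting between attempts; rethrow any other error.
async function retryOn429<T>(
  fn: () => Promise<T>,
  attempts = 3,
  delayMs = 7000 // ~10 requests/min allowed, so pause before retrying
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err: any) {
      lastError = err;
      const isLast = i === attempts - 1;
      if (err?.status !== 429 || isLast) throw err;
      // Wait before the next attempt
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError;
}
```

It could then wrap the call above, e.g. `await retryOn429(() => client.statistics())`.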
In this example, multiple parameters are combined to showcase advanced scraping:
- [playWithBrowser](https://scrape.do/documentation/#play-with-browser?utm_source=github&utm_medium=node-client): Provides the ability to interact with the browser while rendering the page. For example, you can wait for specific elements to load or perform actions like clicking buttons. In this case, it waits for the `<body>` element to ensure the page is fully loaded before proceeding.
```typescript
const { ScrapeDo } = require("@scrape-do/client");

const client = new ScrapeDo("your_api_token");
const response = await client.sendRequest("GET", {
  url: "https://example.com",
  render: true,
  // Browser action shape as described in the scrape.do docs: wait for <body>
  playWithBrowser: [{ Action: "WaitSelector", WaitSelector: "body" }],
});

console.log(response);
```
## Disclaimer
#### Any damages arising from the use of the library or service or any other legal situation cannot be associated with the scrape.do legal entity and team. The responsibility lies entirely with the user.