
Commit 0baf6a1

chore: readme update

1 parent: e7ccea8

3 files changed: +33 −835 lines changed

.github/workflows/build-test.yml

Lines changed: 2 additions & 2 deletions

````diff
@@ -30,5 +30,5 @@ jobs:

       - name: Run tests
         env:
-          TOKEN: ${{ secrets.TOKEN }} # Reads the token from GitHub Secrets
-        run: yarn test # Runs the test command
+          TOKEN: ${{ secrets.TOKEN }}
+        run: yarn test
````
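The change above only removes the Turkish comments; the workflow still injects the API token into the test run from GitHub Secrets. As a minimal sketch (not part of this commit — `readToken` is a hypothetical helper), test code could pick that value up from the environment and fail fast when it is missing:

```typescript
// Hypothetical helper (not in this repo): reads the TOKEN value that the
// workflow injects via `env: TOKEN: ${{ secrets.TOKEN }}`.
function readToken(env: Record<string, string | undefined>): string {
  const token = env.TOKEN;
  if (!token) {
    // Failing fast gives a clearer CI error than a cryptic 401 later.
    throw new Error("TOKEN is not set; expected it from GitHub Secrets in CI");
  }
  return token;
}

// In a real test this would be `readToken(process.env)`.
console.log(readToken({ TOKEN: "example_token" }));
```

Failing early on a missing secret makes forks and misconfigured runners surface a readable error instead of an authentication failure deep inside the test suite.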

README.md

Lines changed: 28 additions & 26 deletions

````diff
@@ -23,71 +23,71 @@ npm build
 
 ## Example Usages
 
-> ### extraHeaders
+### super (residential proxy)
 
-#### extraHeaders is used when you want to add one or more headers specifically required by the target website, without altering the core headers automatically generated by the service. This is useful for passing additional information while maintaining the integrity of the existing request headers.
-
-#### The following example returns the response of how you requested from httpbin.co. You should see the ‘Key’ header in the header section of the response.
+#### The super parameter enables the use of a residential proxy for the request. When this parameter is set to true, the request will be routed through a residential IP address. This means that the IP address will typically appear as if it belongs to a mobile network provider, adding an additional layer of anonymity and making the request look more like regular web traffic.
 
 ```typescript
 const client = new ScrapeDo("example_token");
 const response = await client.sendRequest("GET", {
   url: "https://httpbin.co/anything",
-  extraHeaders: {
-    Key: "Value",
-  },
+  super: true,
 });
 
 console.log(response.data);
 ```
 
-> ### forwardHeaders
+### customHeaders
 
-#### The forwardHeaders option is ideal when you want to forward your custom headers directly to the target website without any additional headers being generated or modified by the service. This approach makes the request appear as if it is being made directly from your end, preserving the original header structure.
+#### The customHeaders option gives you full control over all headers sent to the target website. When you use customHeaders, the headers you provide will completely replace the default ones. This feature is useful when you need to define specific headers like User-Agent, Accept, Cookies, and more, ensuring that only your specified headers are sent with the request.
 
 ```typescript
 const client = new ScrapeDo("example_token");
 const response = await client.sendRequest("GET", {
   url: "https://httpbin.co/anything",
-  forwardHeaders: {
+  customHeaders: {
     Key: "Value",
   },
 });
 
 console.log(response.data);
 ```
 
-> ### customHeaders
+### extraHeaders
 
-#### The customHeaders option gives you full control over all headers sent to the target website. When you use customHeaders, the headers you provide will completely replace the default ones. This feature is useful when you need to define specific headers like User-Agent, Accept, Cookies, and more, ensuring that only your specified headers are sent with the request.
+#### extraHeaders is used when you want to add one or more headers specifically required by the target website, without altering the core headers automatically generated by the service. This is useful for passing additional information while maintaining the integrity of the existing request headers.
+
+#### The following example returns the response of how you requested from httpbin.co. You should see the ‘Key’ header in the header section of the response.
 
 ```typescript
 const client = new ScrapeDo("example_token");
 const response = await client.sendRequest("GET", {
   url: "https://httpbin.co/anything",
-  customHeaders: {
+  extraHeaders: {
     Key: "Value",
   },
 });
 
 console.log(response.data);
 ```
 
-> ### super (residential proxy)
+### forwardHeaders
 
-#### The super parameter enables the use of a residential proxy for the request. When this parameter is set to true, the request will be routed through a residential IP address. This means that the IP address will typically appear as if it belongs to a mobile network provider, adding an additional layer of anonymity and making the request look more like regular web traffic.
+#### The forwardHeaders option is ideal when you want to forward your custom headers directly to the target website without any additional headers being generated or modified by the service. This approach makes the request appear as if it is being made directly from your end, preserving the original header structure.
 
 ```typescript
 const client = new ScrapeDo("example_token");
 const response = await client.sendRequest("GET", {
   url: "https://httpbin.co/anything",
-  super: true,
+  forwardHeaders: {
+    Key: "Value",
+  },
 });
 
 console.log(response.data);
 ```
 
-> ### render (javascript execution - humanized browser rendering)
+### render (javascript execution - humanized browser rendering)
 
 #### The render parameter allows for the execution of JavaScript during the request, enabling full browser-like rendering. When this parameter is set to true, the service will render the target webpage as if it were being loaded in a real browser, executing all JavaScript, loading dynamic content, and handling client-side interactions. This approach is particularly useful for scraping websites that rely heavily on JavaScript to display their content, providing a more accurate and “humanized” view of the page.
 
````
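The merge-versus-replace semantics that the README text describes for extraHeaders and customHeaders happen on the service side, but they can be illustrated locally. The sketch below is an assumption-laden illustration only (the function names, the sample "generated" headers, and the merge logic are mine, not the Scrape.do implementation):

```typescript
// Local illustration of the documented semantics, not the service's code.
type Headers = Record<string, string>;

// Stand-in for headers the service would generate on its own (made-up values).
const generated: Headers = { "User-Agent": "service-ua", Accept: "*/*" };

// extraHeaders: keep the generated headers, layer yours on top.
function withExtraHeaders(extra: Headers): Headers {
  return { ...generated, ...extra };
}

// customHeaders: your headers completely replace the generated ones.
function withCustomHeaders(custom: Headers): Headers {
  return { ...custom };
}

console.log(withExtraHeaders({ Key: "Value" }));  // generated headers survive, plus Key
console.log(withCustomHeaders({ Key: "Value" })); // only Key is sent
```

Under this reading, extraHeaders is the safe default for adding one site-specific header, while customHeaders (and forwardHeaders, which additionally suppresses header generation) is for when the target must see exactly the header set you constructed.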

````diff
@@ -101,14 +101,14 @@ const response = await client.sendRequest("GET", {
 console.log(response.data);
 ```
 
-> ### final bonus example (render, super, geoCode, playWithBrowser)
+### final bonus example (render, super, geoCode, playWithBrowser)
 
 #### In this example, multiple parameters are combined to showcase advanced scraping capabilities. By using a combination of render, super, geoCode, and playWithBrowser, you can perform complex scraping tasks that require JavaScript execution, residential proxies, geographical targeting, and interactive browser actions:
 
-- render: true: Enables JavaScript execution to fully render the webpage, allowing for the scraping of dynamic content that relies on client-side scripting.
-- super: true: Utilizes a residential proxy, which makes the request appear as if it is coming from a typical user on a mobile network, providing enhanced anonymity and avoiding blocks from anti-scraping measures.
-- geoCode: "us": Targets a specific geographic location for the request, in this case, the United States. This is useful for scraping content that varies by region, such as localized prices or region-specific data.
-- playWithBrowser: Provides the ability to interact with the browser while rendering the page. For example, you can wait for specific elements to load or perform actions like clicking buttons. In this case, it waits for the <body> element to ensure the page is fully loaded before proceeding.
+- [render: true](https://scrape.do/documentation/#js-render?utm_source=github&utm_medium=node-client): Enables JavaScript execution to fully render the webpage, allowing for the scraping of dynamic content that relies on client-side scripting.
+- [super: true](https://scrape.do/documentation/#super-residential--mobile?utm_source=github&utm_medium=node-client): Utilizes a residential proxy, which makes the request appear as if it is coming from a typical user on a mobile network, providing enhanced anonymity and avoiding blocks from anti-scraping measures.
+- [geoCode](https://scrape.do/documentation/#geo-targeting?utm_source=github&utm_medium=node-client): "us": Targets a specific geographic location for the request, in this case, the United States. This is useful for scraping content that varies by region, such as localized prices or region-specific data.
+- [playWithBrowser](https://scrape.do/documentation/#play-with-browser?utm_source=github&utm_medium=node-client): Provides the ability to interact with the browser while rendering the page. For example, you can wait for specific elements to load or perform actions like clicking buttons. In this case, it waits for the <body> element to ensure the page is fully loaded before proceeding.
 
 ```typescript
 const client = new ScrapeDo("example_token");
````
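The body of the bonus example itself falls between the diff hunks and is not shown here. As a sketch only, the parameters listed above could be combined into a request-options object like the one below; the exact shape of the playWithBrowser action is an assumption on my part, not taken from this commit:

```typescript
// Sketch of combined request options, based on the parameter list above.
// The playWithBrowser action format is assumed, not quoted from the commit.
const options = {
  url: "https://httpbin.co/anything",
  render: true, // execute JavaScript with full browser rendering
  super: true, // route through a residential/mobile proxy
  geoCode: "us", // exit from a US IP address
  playWithBrowser: [
    // Assumed action shape: wait for <body> so the page is fully loaded.
    { Action: "WaitSelector", WaitSelector: "body" },
  ],
};

// Would then be passed as: await client.sendRequest("GET", options);
console.log(JSON.stringify(options));
```

Keeping the options in one object makes it easy to toggle individual capabilities (for example, dropping super when a datacenter IP suffices) without restructuring the call.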
````diff
@@ -128,11 +128,13 @@ const response = await client.sendRequest("GET", {
 console.log(response.data);
 ```
 
-## More details
-
-#### [Documentation for more information](https://scrape.do/documentation/?utm_source=github&utm_medium=node-client)
+## Official links
 
-#### [Scrape.do](https://scrape.do?utm_source=github&utm_medium=node-client)
+- [Scrape.do](https://scrape.do?utm_source=github&utm_medium=node-client)
+- [Documentation](https://scrape.do/documentation/?utm_source=github&utm_medium=node-client)
+- [Features](https://scrape.do/#features?utm_source=github&utm_medium=node-client)
+- [Blog](https://scrape.do/blog/?utm_source=github&utm_medium=node-client)
+- [LinkedIn](https://www.linkedin.com/company/scrape-do/)
 
 ## License
````