nex-scraper

Simple proxy scraper by NeutronX (yaxeldragon)

Used by:

MoreBots.OVH - Agar.io Bots.

Terms

When downloading this module/script, you accept that:

The user/company uses it under their own responsibility.

How it Works

Once you input the URLs you want to look through, nex-scraper sends a request to each one and uses a regex to check that every candidate is an actual proxy. It then gives you access to all proxies in an array returned from an async function.
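
As an illustration of the kind of regex check described above (the exact pattern nex-scraper uses internally is not documented here, so this one is only an assumption):

// Hypothetical host:port pattern - nex-scraper's real regex may differ.
const PROXY_RE = /\b(?:\d{1,3}\.){3}\d{1,3}:\d{2,5}\b/g;

const html = '<li>123.45.67.89:8080</li><li>not a proxy</li>';
const matches = html.match(PROXY_RE) || [];
console.log(matches); // [ '123.45.67.89:8080' ]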

Features

  • Custom URL Scrape.
  • Filter from an HTML document.
  • Check proxies with Regex.
  • Remove all Duplicates.
  • Easy access to all Proxies.
Area          | Quick Description
------------- | -------------------------------
Get Started   | How to start using nex-scraper.
How to Scrape | How to get the Proxies.
Configuration | Customize nex-scraper.
Versions      | See what's new.

Get Started

To install nex-scraper in your project, run the following in your console:

npm install nex-scraper

Once nex-scraper is installed, you can import it into your project as follows:

const proxy = require('nex-scraper');
const proxies = new proxy([
    '(input URL here)',
    '(another URL here...)'
]);

You can pass as many URLs to the constructor as you want.

How to Scrape

Once the instance is created, you can scrape the proxies as follows:

const proxy = require('nex-scraper');
const proxies = new proxy([
    'http://www.live-socks.net/2020/05/28-05-20-socks-5-servers.html',
    // ^^^^^^^ This Is HTML (It will get filtered)
    'https://api.proxyscrape.com/?request=getproxies&proxytype=socks5'
    // ^^^^^^^ This Is a normal API (RAW text)
]);
proxies.scrape().then(proxies => {
    // Here you can access the Proxies
});
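
For example, assuming the resolved value is an array of host:port strings as described in How it Works, you could list them like this:

proxies.scrape().then(list => {
    console.log(`Found ${list.length} proxies`);
    list.forEach(p => console.log(p)); // e.g. "123.45.67.89:1080"
});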

Configuration

Here you can decide whether to remove duplicates, whether you want it to filter for something specific, etc.

// All configurations:
{
    removeDuplicates: true,    // bool
    filter: 'XXX.XXX.XXX.XXX:XXXX' // string
}
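
Both options can be set in one call, for instance (a minimal sketch, assuming setConfig accepts the same object shown above):

const proxy = require('nex-scraper');
const proxies = new proxy(['URLs...']);
proxies.setConfig({
    removeDuplicates: false,        // keep duplicate proxies
    filter: 'XXX.XXX.XXX.XXX:XXXX'  // custom pattern (see Custom Filter below)
});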

  • Remove Duplicates: how to enable/disable removing duplicates.
  • Custom Filter: customise what to filter.
  • forEach Callback: run a callback for each proxy.

Remove Duplicates

const proxy = require('nex-scraper');
const proxies = new proxy([
    'URLs...'
]);
proxies.setConfig({
    removeDuplicates: false, // Default: true
});

With removeDuplicates set to false, nex-scraper will NOT remove ANY duplicates, so you may get the same proxy more than once.

Custom Filter

const proxy = require('nex-scraper');
const proxies = new proxy([
    'URLs...'
]);
proxies.setConfig({
    filter: "XXX.XXX.XXX.XXX:XXXX", // Default: true
});

Note: the default filter covers a wider range than any custom pattern.

This will ONLY return proxies that look like XXX.XXX.XXX.XXX:XXXX. For example, the pattern XXX.XXX.XXX.XXX:XXXX matches something shaped like 123.456.789.123:4567.

More examples:

// -------------------------------------
proxies.setConfig({
    filter:  "XXX.XXX.XXX.XXX:XXXX",
    // Look: "123.456.789.123:4567"
});
// -------------------------------------
proxies.setConfig({
    filter:  "XX.XXX.XXX.XX:XXXX",
    // Look: "12.345.678.91:2345"
});
// -------------------------------------

forEach Callback

const proxy = require('nex-scraper');
const proxies = new proxy([
    'URLs...'
]);
proxies.setConfig({
    foreach: (proxy) => {
        console.log(`Gotten ${proxy}`);
    }
});

The provided callback will run once for each proxy that is found.
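
For instance, the callback can be combined with scrape() to count proxies as they come in (a sketch, assuming the callback fires for every proxy collected during scrape()):

const proxy = require('nex-scraper');
const proxies = new proxy(['URLs...']);

let count = 0;
proxies.setConfig({
    foreach: (p) => {
        count += 1;                   // count each proxy as it is reported
        console.log(`#${count}: ${p}`);
    }
});
proxies.scrape().then(() => console.log(`Done, ${count} proxies seen.`));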

Versions

Happy Coding!
