Creating a Scraper for Different Websites
I’m looking to create a scraper; here’s what I want to do. I don’t need a lot of information scraped, or super-human speed, it can run all night for all I care.
Here’s what I need:
I need the ability to create “jobs”, where each job is executed at a predefined time, on specific websites, for a specific purpose.
I also need the script to let me check, at intervals I define, for the same item, and for multiple items at the same time as well.
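The recurring, multi-item checks described above could be sketched with Python’s standard-library scheduler. This is a minimal illustration, not a full implementation: `run_job`, the site names, and the keywords are all hypothetical placeholders, and the real scraping logic would go where the `print` is.

```python
import sched
import time

def run_job(scheduler, interval_s, keywords, sites):
    """Hypothetical job body: search each site for each keyword,
    then re-schedule itself to run again after interval_s seconds."""
    for site in sites:
        for kw in keywords:
            print(f"searching {site} for {kw!r}")  # placeholder for real scraping
    scheduler.enter(interval_s, 1, run_job,
                    (scheduler, interval_s, keywords, sites))

s = sched.scheduler(time.time, time.sleep)
# One job checking every 6 hours, for two example keywords on two platforms:
s.enter(0, 1, run_job, (s, 6 * 3600, ["bike", "lamp"], ["craigslist", "ebay"]))
# s.run()  # blocks and runs the job loop; fine to leave running overnight
```

Each job re-enters itself after running, which is how “check at intervals I define” maps onto a one-shot scheduler.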
I need the script to cover VK.com (Marketplace), Facebook.com (Marketplace), Ebay.com, Avito.ru, OfferUp (a mobile app), Aliexpress.com, and Craigslist.com. For Craigslist I already have a script, since it’s rather complicated to search; my script can be run as-is or converted to Python (it was created using iMacros). I need to be able to choose which platforms to search on, search for a specific keyword, put the results in an Excel file (for example), and then compare them.
My purpose with this script is to resell/flip. I want the script to check for me whether an item is worth flipping if I buy it, for example, from Craigslist and sell it on Ebay.
Each site will probably have to be scraped separately, with its own script, and then all the info compared together to suggest the best place to buy and the best place to resell the item for profit.
Initially, a PoC for Craigslist–Ebay is good enough. Later on, we need to move on to the other websites to create a fully automated system that checks every day, according to the intervals I’ve set, and alerts me when items can be resold for profit.
This process can be done using a website that shows the AVERAGE SELLING PRICE and SELL-THROUGH RATE: if we search Craigslist for items, collect the data, and compare it against checkaflip.com, we can get suggestions for flippable items.
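The flip check itself is simple arithmetic. A minimal sketch of the comparison, assuming a buy price from a Craigslist listing and an average selling price plus sell-through rate from a site like checkaflip.com; the threshold defaults are illustrative, not values from this posting:

```python
def is_flippable(buy_price, avg_selling_price, sell_through_rate,
                 min_margin_pct=30.0, min_sell_through=50.0):
    """Return True if reselling clears both thresholds.
    Threshold defaults are illustrative assumptions."""
    if buy_price <= 0:
        return False
    margin_pct = (avg_selling_price - buy_price) / buy_price * 100
    return margin_pct >= min_margin_pct and sell_through_rate >= min_sell_through

# A $60 Craigslist listing vs. a $100 eBay average with 70% sell-through:
print(is_flippable(60, 100, 70))  # margin is ~66.7%, so this prints True
```

The same two numbers, profit margin % and sell-through rate, are exactly the predefined parameters mentioned below, so one function like this can serve every platform pair.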
The hard part is creating the scraping script, which is sort of done if you can convert my iMacros script to Python. The script also needs to understand the website so that it can work according to my predefined parameters: profit margin % and sell-through rate (both shown on checkaflip.com).
1. Set up multiple jobs, one per keyword or group of keywords
2. Define when each job starts operating
3. Define the keywords to search for
4. Define the profit margin % and sell-through rate thresholds (checkaflip.com shows both)
5. Search Craigslist using a script that avoids captchas and scraping protection (just the iMacros-to-Python conversion)
6. Scrape all results and filter them according to the parameters defined in step 4
7. Show only the relevant results
8. Set the job to run every day if needed, and provide constant updates and notifications automatically (for Craigslist there is a simple solution: Craigslist itself can alert on new ads in specific locations for specific keywords)
9. Compare data from the other websites the same way and create a network of notifications across all platforms
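Steps 6–7 (filter scraped results by the step-4 parameters, then keep only the relevant ones) could be sketched as below. The field names (`buy_price`, `avg_sell_price`, `sell_through`) and the sample listings are assumptions for illustration; real rows would come from the per-site scrapers, and the CSV output stands in for the Excel file mentioned above.

```python
import csv

def filter_results(rows, min_margin_pct, min_sell_through):
    """Keep only listings that meet the step-4 thresholds (sketch).
    Each row is assumed to carry the fields shown in the sample data."""
    kept = []
    for row in rows:
        margin = (row["avg_sell_price"] - row["buy_price"]) / row["buy_price"] * 100
        if margin >= min_margin_pct and row["sell_through"] >= min_sell_through:
            kept.append({**row, "margin_pct": round(margin, 1)})
    return kept

# Hypothetical scraped rows standing in for real Craigslist/eBay data:
listings = [
    {"title": "Bike", "buy_price": 50.0, "avg_sell_price": 120.0, "sell_through": 80.0},
    {"title": "Lamp", "buy_price": 40.0, "avg_sell_price": 45.0, "sell_through": 90.0},
]
relevant = filter_results(listings, min_margin_pct=30.0, min_sell_through=50.0)

# Write the surviving rows out, step 7's "only relevant results":
with open("relevant.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(relevant[0].keys()))
    writer.writeheader()
    writer.writerows(relevant)
```

Here only the bike survives (140% margin); the lamp is dropped at 12.5% margin despite its high sell-through rate.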
For similar work requirements, feel free to email us at email@example.com.
Can you build a tool to scrape resumes from http://www.monster.com? What would be the price of the tool?
Can you provide us with an API for https://www.copart.com/todaysAuction/ and https://www.iaai.com? Reply ASAP.
Let me know if you guys can build a scraper to scrape jobs from http://www.indeed.com. Please send your price structure for the same.
Write a script to scrape resumes from http://www.ziprecruiter.com; I’ll provide more details on a call.
Need an expert to develop software to scrape car prices daily from http://www.autobidmaster.com, please advise.
Develop a scraper to extract a dentists list from the dental directories on the attached list of 15–20 websites.
Create a tool to scrape various lawyer directories; it needs to extract the following data: Lawyer Name, Practice Name, Address, City, State, Phone, Email, Website URL, Practice Area, Education.
Let me know if you can scrape jobs posted on jobserve.com and deliver output in CSV format.
We are looking for someone to scrape job listings daily from https://www.careerbuilder.com/.