Your objective is to collect the following details from a website and store them in a CSV file:
1. Full address (which can be found under the heading "How to find us")
2. Name of owner (which can be found under the heading "Owner Information")
3. Email address (which can be found under the heading "Owner Information")
4. Website URL (which can be found under the heading "Owner Information")
Please note that items 3 and 4 are not present on all records.
To access the records on the website you must first enter a UK postcode. We will supply you with a complete list of UK postcodes to enter into the website's form.
Once you have entered the postcode you will be presented with a list of results. Each result contains an image; you need to click on the image to reveal the details we need you to collect.
There are approximately 10,000 records to collect. Please contact us for the website address.
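The collection step described above (open a detail page, read the fields under "How to find us" and "Owner Information", write a CSV row, tolerating missing email/website) can be sketched as follows. The actual website URL and page markup are not given, so `SAMPLE_HTML` and the extraction patterns below are purely illustrative assumptions:

```python
import csv
import re

# Hypothetical detail-page markup -- the real site's structure is unknown,
# so this sample and the regexes below are assumptions for illustration only.
SAMPLE_HTML = """
<h2>How to find us</h2>
<p>1 High Street, London, SW1A 1AA</p>
<h2>Owner Information</h2>
<p>Name: Jane Smith</p>
<p>Email: jane@example.com</p>
<p>Website: https://example.com</p>
"""

def parse_record(html: str) -> dict:
    """Extract the four fields; email and website may be absent (items 3 and 4)."""
    def first(pattern):
        m = re.search(pattern, html)
        return m.group(1).strip() if m else ""  # empty string when the field is missing
    return {
        "address": first(r"How to find us</h2>\s*<p>(.*?)</p>"),
        "owner":   first(r"Name:\s*(.*?)</p>"),
        "email":   first(r"Email:\s*(.*?)</p>"),
        "website": first(r"Website:\s*(.*?)</p>"),
    }

def write_csv(records, path):
    """Append all collected records to one CSV file with a header row."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["address", "owner", "email", "website"])
        writer.writeheader()
        writer.writerows(records)

record = parse_record(SAMPLE_HTML)
write_csv([record], "records.csv")
```

In a real run, the loop would submit each supplied postcode to the site's search form, follow each result's image link to its detail page, and feed that page's HTML through `parse_record` before writing the accumulated rows once at the end.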
How many zip codes do you have in total? Could you please provide the URL where we need to enter the zip code to obtain the data? I will provide a sample at once. Thanks.
59 freelancers are bidding on average £143 for this job
Hi sir, I'm a professional web data extractor. Would you like to show me the website URL? I can write a script to pull all the listings down to an Excel or CSV file. Waiting for your reply, thanks. Regards, Nathan.
Hi, I have a good team, so I will be able to collect these 10k records easily in a few days. Can you send me the web address and the zip codes, please?
Hi there, we are good at scraping. Can you send the zip codes and website details so we can take a look? We can try to find an easy way to get all the information. Awaiting your response. Thanks.
Hi, how are you? I hope you are well. I understand that you need 10,000 UK postal addresses, emails, and websites. I will finish your work with 100% accuracy within 72 hours. I want to start now. Thanks, Bayzid Khan.
Dear sir, I am an expert in web scraping. Previously, I developed many spiders using PHP, Node.js, and Scrapy. For example, I scraped data for CEOs (eBay, welivv, [login to view URL], [login to view URL], and so on). Thanks.