County website Scraping
Hi there,

I am about to have a website scraping tool built, but I am not sure of everything to consider here. I have given the developer the basic requirement: for example, the program should be able to work with different sites built on different technologies (one might be built on .NET, another in plain HTML), and the software should be robust enough to handle these variations.

If anybody has any pointers, that would help; you can PM me as well.

Alternatively, if anyone knows of a good out-of-the-box product that I can purchase or subscribe to, I can do that as well.

Thanks,
David
Most Popular Reply
@David Des I’ve built a scraper for my county site (Snohomish County in Washington). Here’s how I use it:
I get my lists from title, public disclosure requests, MLS, driving for dollars, etc., and I keep all of the parcel numbers for those properties. I can then upload the CSV file. An API endpoint on AWS runs a Python script that takes each parcel number, goes to the county site for that parcel, scrapes the HTML for all data, then pushes that data to my Podio database. This happens to work amazingly well for my area because the county site is structured like this:
www.countysite.sample/?=12345 (12345 would be the parcel number)
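A minimal sketch of that parcel-number scrape, using only the standard library. The base URL mirrors the hypothetical pattern above, and the label/value table layout is an assumption for illustration; a real county page will need its own parsing rules:

```python
import urllib.request
from html.parser import HTMLParser

BASE_URL = "https://www.countysite.sample/?="  # hypothetical pattern from the post

def parcel_url(parcel: str) -> str:
    """Build the assessor URL for one parcel number."""
    return BASE_URL + parcel

class LabelValueParser(HTMLParser):
    """Collect <th>label</th><td>value</td> pairs from an assessor detail table."""
    def __init__(self):
        super().__init__()
        self._tag = None      # the cell type we are currently inside
        self._label = None    # last <th> text seen, waiting for its <td>
        self.records = {}

    def handle_starttag(self, tag, attrs):
        if tag in ("th", "td"):
            self._tag = tag

    def handle_data(self, data):
        text = data.strip()
        if not text or self._tag is None:
            return
        if self._tag == "th":
            self._label = text
        elif self._label:
            self.records[self._label] = text
            self._label = None

    def handle_endtag(self, tag):
        if tag in ("th", "td"):
            self._tag = None

def scrape_parcel(parcel: str) -> dict:
    """Fetch the county page for a parcel and return its label/value fields."""
    with urllib.request.urlopen(parcel_url(parcel)) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    parser = LabelValueParser()
    parser.feed(html)
    return parser.records
```

In practice a library like `requests` plus `BeautifulSoup` makes the parsing half much shorter, but the shape of the loop is the same: build the URL from the parcel number, fetch, parse, return a dict.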
For more advanced sites you could still use a tool like Python’s Selenium to actually click through and submit web forms to get to the data you need.
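For the form-submission case, a Selenium sketch might look like the following. The search URL, form field name, and button id are all hypothetical, and `selenium` plus a matching browser driver must be installed separately:

```python
SEARCH_URL = "https://www.countysite.sample/search"  # hypothetical form page

def search_payload(parcel: str) -> dict:
    """Pure helper: the form fields we would submit for one parcel.
    The field name 'parcelNumber' is an assumption."""
    return {"parcelNumber": parcel}

def fetch_via_form(parcel: str) -> str:
    """Open the search form, type in the parcel number, submit, and
    return the resulting page's HTML for scraping."""
    from selenium import webdriver               # imported lazily so the pure
    from selenium.webdriver.common.by import By  # helper works without selenium
    driver = webdriver.Chrome()
    try:
        driver.get(SEARCH_URL)
        for name, value in search_payload(parcel).items():
            driver.find_element(By.NAME, name).send_keys(value)
        driver.find_element(By.ID, "searchButton").click()  # hypothetical id
        return driver.page_source
    finally:
        driver.quit()
```

The `try`/`finally` around `driver.quit()` matters when you run this over thousands of parcels; leaked browser instances will exhaust memory on a small AWS box.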
Doing this is extremely powerful. I’m able to take a list of a couple thousand properties and have them all imported into my CRM within an hour... including owner name, owner address, back taxes due, sales history, bedrooms, bathrooms, images, floor plans, etc. If you need guidance on how to spec this out to a developer, just shoot me a message.