
All Forum Posts by: Account Closed

Account Closed has started 4 posts and replied 38 times.

Post: St.Paul landlord list for download, for a limited time

Account ClosedPosted
  • Posts 42
  • Votes 33

Here's St. Paul's rental/landlord list, similar to the one for Minneapolis.

This was updated 01/27/2020.

This share link will only be active until Monday morning.

For those who ask 'why' I take these down after a while: if I leave them online, people will contact me weeks or months later asking why an owner name has changed. And well, sales happen.

What's important here: column 'O' (PID = PIN) is what you can use to match against the county file in the future, for your own updates to owner information.
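To make that concrete, here's a minimal Python sketch of the matching. The column names ('PID', 'OWNER_NAME') are assumptions standing in for whatever headers the actual sheets use:

```python
import csv

def load_county_owners(path, pid_col="PID", owner_col="OWNER_NAME"):
    """Index the county file by PID so owner lookups are O(1)."""
    owners = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            owners[row[pid_col].strip()] = row[owner_col].strip()
    return owners

def refresh_owner_names(landlord_rows, county_owners, pid_col="PID"):
    """Overwrite stale owner names in the landlord list with the
    county's current ones, matching on the PID column."""
    for row in landlord_rows:
        current = county_owners.get(row[pid_col].strip())
        if current:
            row["OWNER_NAME"] = current
    return landlord_rows
```

Any PID not present in the county file is simply left alone, so a partial county extract won't blank out your list.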

Enjoy. Spring is coming. Time to build that pipeline.

https://docs.google.com/spread...

Post: Minneapolis has made it easier to get the violations data now


@Kim Feldkamp Thank ya ma'am. I just believe in 'passing it forward'.

If you are in St. Paul, take a look at my last upload.

Here's a file from the end of December: the water shutoff notices from Jan 2019 through Dec 2019. I had planned to concentrate on cold calling it, but I just received a more interesting file for MPLS that I'm going to focus on instead.

This is 'good' as of the end of December. You will see duplicates in it for the same address/PID where a property received multiple shutoff notices through the year.

Columns 'B' and 'C' are the notice type and the date sent. The 'shut off' notices are the actual notice that the occupants are being removed from the home under a Category II condemnation.

Personally, I do not focus on Cat 2. I want the dates of the notices so I can model what happens before a property reaches Cat 2. But a lead is a lead, so I track everything.
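A quick sketch of how you might rank those repeats. The field names ('PID', 'DATE_SENT') are made-up stand-ins for columns 'O' and 'C':

```python
from collections import Counter
from datetime import datetime

def notice_pressure(rows):
    """Count shutoff notices per PID and keep the earliest notice date
    for each, so you can rank 'pain' before a property ever reaches a
    Cat 2 condemnation. Assumes each row has 'PID' and 'DATE_SENT'
    (MM/DD/YYYY)."""
    counts = Counter()
    earliest = {}
    for row in rows:
        pid = row["PID"]
        counts[pid] += 1
        sent = datetime.strptime(row["DATE_SENT"], "%m/%d/%Y")
        if pid not in earliest or sent < earliest[pid]:
            earliest[pid] = sent
    # Most-notified PIDs first: repeated notices through the year are the leads.
    ranked = sorted(counts.items(), key=lambda kv: -kv[1])
    return ranked, earliest
```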

:) this is like throwing chum to a school of sharks, considering some of the people logged in here and the marketing budgets they have access to.

https://docs.google.com/spread...

The 'share' link will be good until Sunday.

Post: Minneapolis has made it easier to get the violations data now


@Account Closed 

'you weren't kidding on the slow load' - Riiight? It's swell they are trying to organize more data like that... but that's just too much for Tableau. The 'map' tab they included was neat too, in concept, for visualizing it all... but useless in practice.


'are there more of these nifty tableaus that Minnesota offers?' - Not really, that I know of, and I'm always looking. From city to city, they all seem to just provide static Open Data portals; what one city releases, the next one will not. I've made myself a bit of a pest to the various gov workers in the 7-county metro... 'so hey, I have this file from xyz city, and I want the same from you'... with mixed results. St. Paul is the worst.

Minneapolis is the best at being transparent with data, though. I've come to have a semi-direct line into the data/records people, and some of them are huge data geeks. Back in November, one of them sent me a Monte Carlo simulation of a housing data set, built in Python with Jupyter Notebooks, and my reaction was a total 'are you kidding me? You're using Jupyter Notebooks in-house and looking at it like this...' (People in RE need to start familiarizing themselves with JupyterLab/Notebooks. Jupyter makes Excel look like an old abacus counting frame; it's a data welding/cutting torch.)
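For anyone wondering what a Monte Carlo in a notebook even looks like, here's a toy sketch. Every number in it is a made-up assumption; it only shows the shape of the technique (simulate thousands of random annual outcomes for a rental, then look at the distribution):

```python
import random
import statistics

def simulate_annual_cashflow(n_sims=10_000, rent=1500, seed=42):
    """Toy Monte Carlo: annual cash flow for one rental under random
    vacancy and repair draws. All inputs are invented assumptions."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_sims):
        # Random months vacant, weighted toward full occupancy.
        vacant_months = rng.choices([0, 1, 2, 3], weights=[60, 25, 10, 5])[0]
        repairs = rng.uniform(500, 4000)      # annual repair spend
        gross = rent * (12 - vacant_months)
        expenses = repairs + 0.35 * gross     # taxes/insurance/management
        results.append(gross - expenses)
    # Mean plus decile cut points of the simulated distribution.
    return statistics.mean(results), statistics.quantiles(results, n=10)
```

The point isn't the numbers; it's that the whole distribution (not just one "average" scenario) falls out in a dozen lines, which is exactly the kind of thing that's painful in a spreadsheet.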

Under the surface of all of this? Good things are coming. With the mass upzones of the 2040 plan, the city is trying to find ways to track/record every bit of housing data (so they can granularize all the fees and penalties they'll use to zap multifam owners in a more efficient way). Hence all of the experimenting with fun open-source tools like Jupyter.

What are you looking for specifically (if you can say)? I might be able to point you in a direction.

@Tim Swierczek Enjoy sir!

Post: Minneapolis has made it easier to get the violations data now


Minneapolis came up with a pretty cool Tableau dashboard back in September of last year. https://tableau.minneapolismn.... . It's a little slow loading, but it will load.

It lets you dial in all of the violations in really granular detail per property, and the download gives you a clean address list to import into your own Tableau install and 'list stack' against any other list(s) you might be collecting that have at least an address. Video 10, 'Data Preparation – Joins and Unions' (https://public.tableau.com/en-...), makes this stupid simple. You can use the drop-down for rental license 'tiers' from the city's Tableau dashboard, dial in the pain, and load one of the sheets I posted a few days ago to match everything up against. It really is 'click the address column in the violations sheet, click the column with the address in the landlord file' aaaand ♪presto♪: matched.

I dunno, I still think this is easier with PostgreSQL. Load the list that you downloaded as a table into PostgreSQL, and run a 'join' against the Assessor or landlord table that you load in as well. Tableau is a bit of a resource hog on CPU/RAM when running joins... but it does make stacking basically any list(s) 'click-click=done'.
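The join itself is one line of SQL. Here's a minimal sketch using Python's built-in SQLite as a stand-in for PostgreSQL; the table and column names are invented for the example:

```python
import sqlite3

# In-memory SQLite standing in for PostgreSQL; the JOIN works the same way.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE violations (address TEXT, violation TEXT);
    CREATE TABLE landlords  (address TEXT, owner TEXT);
    INSERT INTO violations VALUES ('123 Main St', 'Tier 3'), ('77 Oak Ave', 'Tier 2');
    INSERT INTO landlords  VALUES ('123 Main St', 'ACME LLC'), ('9 Elm Rd', 'J. Doe');
""")

# 'List stacking': match the violations list against the landlord list on address.
stacked = conn.execute("""
    SELECT v.address, v.violation, l.owner
    FROM violations v
    JOIN landlords l ON l.address = v.address
""").fetchall()
print(stacked)  # one matched row: ('123 Main St', 'Tier 3', 'ACME LLC')
```

In real use the addresses need normalizing first (casing, abbreviations, unit numbers), which is where most of the actual work hides.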

Post: Archive of rental investors in Minneapolis for download


@Jacob Johnson

I get basically every sale statewide, with the contact info of the parties involved. The 'front burner' task is matching everything (emails, phone numbers) to sales that might use a different entity name for the transaction, or even something as simple as the same owner address under a different name/email/phone, along with what they're buying/selling. Mapping all of that is what I'm working on now.

What I'm in the weeds with, the end goal, is to accurately determine the space available on a parcel for an ADU or carriage house. It can be done... I never cared for math in school (NOW I CARE), but the more I dig into prospecting for 2020 and beyond, the clearer it is that the 'just go get a delinquent tax/etc. list' advice that every person tells every other person in real estate is played out. Those lists are beaten up, and it's a really small pond to fish in. (They still work... I found 3 last week by matching water shutoff notices against inventory not receiving mail... you just have to DIG with pain points.) Today you have to be able to model everything around what the different cities are doing with their housing plans. Finding a person willing to sell, on a parcel with room for an ADU, that also falls within the right zones of the MPLS 2040 plan AND an Opportunity Zone? That's a score for me. I know people that want that :), and I have found some of them with the matching from my opening paragraph in this post.

This can be done, though. It's just Geographically Weighted Regression: you can run models against county parcel sets, and the big commercial groups do it. It's really about finding really, really clean parcel/property data. Little things like below/above-ground sf can throw things off too. I'm just using the basic parcel dimensions from various muni GIS files, plus Microsoft's awesome building footprint file (https://github.com/Microsoft/U...) for the building footprints on the parcels. I'm still way off on determining the difference right now unless I focus on more recent build years, though.
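Before any regression, even crude arithmetic on parcel vs. footprint square footage gets you a first-pass candidate list. A sketch, with made-up field names and thresholds that are not any city's actual zoning rules:

```python
def adu_candidates(parcels, min_free_sf=600, coverage_cap=0.5):
    """Rough first pass at 'room for an ADU': lot area minus building
    footprint, capped by a hypothetical lot-coverage limit. Each parcel
    is a dict: {'pid', 'lot_sf', 'footprint_sf'}. Thresholds are
    placeholder assumptions, not real zoning."""
    out = []
    for p in parcels:
        free = p["lot_sf"] - p["footprint_sf"]            # open ground on the lot
        buildable = coverage_cap * p["lot_sf"] - p["footprint_sf"]  # room under cap
        if free >= min_free_sf and buildable > 0:
            out.append((p["pid"], round(min(free, buildable))))
    return out
```

The real version has to account for setbacks, easements, and below/above-ground sf, which is exactly why clean parcel data matters so much.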

I've only been playing with Machine Learning for the last 6 months or so, and it's really not an everyday thing; maybe 5-6 hours a week of trying to learn something new with it. It's just that once you see the best tools are open source and basically free, even when you plug into cloud services, it's hard not to want to poke around and figure things out.

Post: Archive of rental investors in Minneapolis for download


I'm not really sure of the Milestone definition, as it looks like it conflicts with the license status from https://opendata.minneapolismn... I did call them a couple of years ago and did the 'and what's this column mean, and what about this one...' dealy-o. What I took away was that for me personally, what mattered was the tier status and the number of units licensed. But that's why I uploaded the complete file history (minus the Geometry column, because that would just create huge sheets): everyone has a different approach.

'Status' just looks like it denotes whether the landlord is paid up and active with the license. Milestone seems to conflict with that on some records, so it may be historical. Milestone almost looks like a sort of 'float', i.e., I'm thinking some of it refers to things that happened in the past, kept as a reference.

Post: Archive of rental investors in Minneapolis for download


Here are the last 4 years of rental licensing history in Minneapolis. You start to see some interesting things when you 'join' this against the city Assessor file, regarding how many units the Assessor says exist versus the number of units being licensed through the city.

It also provides a road map for violations, and the 'tiers' to work off of, changes of ownership... There's just a lot of directions to go with this.

For anyone starting to look at running Machine Learning against property/investor data: there is just a ton here for a Random Forest model once you join in the Assessor file. (Have fun dialing in those 'weights' for your attributes :) )
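A hedged sketch of what that looks like with scikit-learn. The features and label here are invented stand-ins for whatever you actually pull out of the joined license + Assessor tables:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 200
# Hypothetical features: licensed units, assessor units, years owned, rental tier.
X = np.column_stack([
    rng.integers(1, 20, n),   # units_licensed
    rng.integers(1, 20, n),   # units_per_assessor
    rng.integers(0, 30, n),   # years_owned
    rng.integers(1, 4, n),    # rental_tier
])
# Toy label: flag records where licensed units disagree with the Assessor.
y = (X[:, 0] != X[:, 1]).astype(int)

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(model.feature_importances_)  # the attribute 'weights' you end up dialing in
```

With real data the label would be something you care about (sold within a year, cited for violations, etc.), and `feature_importances_` is where the dialing-in happens.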

Google was throwing me an error on the sheet sizes... not sure why, nothing should have gone over 5 million cells. I had to create two tabbed sheets; the tabs are at the bottom.

2016-17 tabbed license information

https://docs.google.com/spread...

2017-18 tabbed license information
https://docs.google.com/spread...

Enjoy. The sheets will be 'live' until Saturday morning, after that I am turning off the 'share' links, so download now if you want them.

Data history is everything.

Post: Non-WiFi Locks in STR


Is it possible to connect everything through powerline networking adapters? Or is it that there's no internet connection at all?

Post: Anyone use APIs to get good information?


For those who may still be looking into this: First American and Corelogic are going to be the best bet for data to build models with, if you want accuracy. They are pretty much the main source fueling the rest of the aggregators when quality counts.

Corelogic prolly has the edge on anything pertaining to valuations and AVMs... but First American is right there chin-to-chin with them. They both have been collecting property data since time immemorial. There's also a history of the two joining as one company, swapping data (swapping spit), and then First American spinning off on its own again. This has happened a couple of times.

Corelogic's tool 'Realist' is baked into 80% of the MLSs nationwide, providing the tax/sales/people data fueling the comps/CMA many realtors use for a quick look.

Attom bought RealtyTrac. It's largely welfare data from Corelogic (https://www.ftc.gov/news-event...). They sued Corelogic again within the last two years or so, because the welfare data they were getting was largely crap, if that tells you anything about what was being sent to them. There was a distinct difference between what Corelogic sent Attom under the court-ordered welfare agreement and what Corelogic supplied directly to consumers. Buying from Corelogic direct is the way to go.

I will say Attom has upped its game in the last few years concerning ML and AI.... so who knows, they might actually have cleaner data now, and overall it might be cheaper to buy from them as a reseller.

First American are quality people, and they are scrappy... they have a huuuge database of scanned doc images that they have run ML against for the juicy tidbits of data a county/city does not collect or load onto public-facing .gov sites.

Most of the data you are looking for, like accurate monthly parcel/assessor/people etc. data, can be had just by googling your county name with 'open data GIS' in the search string. A modern county GIS dept will make a variety of datasets available as .csv, plus some manner of GIS file like a shapefile, KML, geofile, etc. The bonus? It's a staggering amount of data, countywide, that is simple to load into a PostgreSQL/PostGIS install. It's stupid simple to do, and most realtors are oblivious that they can look at data on that scale, let alone outside of their MLS system. Go to your local county site, look for the nifty 'search for a property in the county' map, and look for the download button. If they don't have it online, simply put in an Open Records Data request to the county.
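Loading one of those county .csv exports and indexing it by PID is only a few lines of Python before you ever touch PostGIS. The columns below are invented examples of what a county export might contain:

```python
import csv
import io

# A couple of fake rows standing in for a county 'open data' parcel export.
COUNTY_CSV = """PID,SITE_ADDRESS,OWNER_NAME,LOT_SF
123,123 Main St,ACME LLC,5000
456,9 Elm Rd,J. Doe,7200
"""

# Index the whole export by PID for instant lookups and later joins.
parcels = {row["PID"]: row for row in csv.DictReader(io.StringIO(COUNTY_CSV))}
print(parcels["456"]["OWNER_NAME"])  # J. Doe
```

Swap `io.StringIO(COUNTY_CSV)` for `open("parcels.csv")` on the real file; at county scale you'd then bulk-load the same rows into PostgreSQL with `COPY`.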

For anyone looking to do this in Florida, MN, AZ, VA, or AL, https://propertykey.com/compan... is prolly your best bet... they're likely THE BEST dataset in use by realtors for maps/parcel/assessor/sales/people data; it just keeps going on and on what they have. I could arrange an introduction with the team behind the data. I know them well, and they are just terrific people, total 'makers' when it comes to property/GIS data for 25 years... and most likely far less expensive as a reseller than some of the options out there.

MLS data? Scrape it. Scrape ALL of it. NAR and its outdated approach to MLS data is over (https://www.eff.org/deeplinks/...). Just make sure to rotate your IPs on the regular, or set up some sort of proxy...

The county parcel datasets are the easy thing to obtain... it's the fresh sales data (what sold today/yesterday, to who, from who) that's critical for any ML models you are creating. That's where an API comes into play for fresh sales.

Hope this helps~