

HOUSING PRECOGS: BIG DATA PREDICTIONS BEYOND HEURISTIC HUNCHES

[Interactive visual: 2017 Neighborhood Housing Index, with neighborhoods graded A through F]

look for owners that exhibit similar triggers to predict who is more likely to sell in the future.” Gupta added that “real estate is truly hyper-local, in that, the triggers that matter in a given neighborhood block can be different from the one next door, or even across the street. And these triggers can change from time to time even for the same neighborhood block. Hence, we have had to build hundreds of predictive models that look for various combinations of triggers to find the one that is the most accurate for each neighborhood across the country.”

“We use predictive analytics and machine learning to analyze how likely a homeowner is to sell in the near future. These techniques look at historical data — who has sold in the past — to identify, from several thousand data attributes, which ones may have been a factor in triggering those sales. And then, they look for owners that exhibit similar triggers to predict who is more likely to sell in the future. … Real estate is truly hyper-local, in that, the triggers that matter in a given neighborhood block can be different from the one next door, or even across the street.”

AVI GUPTA, PRESIDENT & CEO, SMARTZIP ANALYTICS
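Gupta's description amounts to a recipe: learn from past sellers which attributes acted as triggers, then score current owners, with a separate model chosen for each neighborhood. A minimal sketch of that kind of per-neighborhood model selection, written in Python with scikit-learn on synthetic data, might look like the following. Every column name, feature, and candidate model here is an assumption for illustration only, not SmartZip's actual system.

```python
# Illustrative sketch only: the data, column names, and candidate models below
# are invented; SmartZip's real attributes, models, and selection logic are not public.
import numpy as np
import pandas as pd
from sklearn.base import clone
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 3000

# Synthetic stand-in for historical homeowner records: a few hypothetical
# "trigger" attributes plus the observed outcome (did the owner sell?).
data = pd.DataFrame({
    "neighborhood": rng.choice(["elm_block", "oak_block", "maple_block"], size=n),
    "years_owned": rng.integers(1, 30, size=n),
    "est_equity_pct": rng.uniform(0.0, 0.9, size=n),
    "recent_life_event": rng.integers(0, 2, size=n),
    "listings_nearby_12mo": rng.integers(0, 15, size=n),
})
# Fabricated target, loosely tied to a couple of attributes for demonstration.
signal = (-3 + 0.08 * data["years_owned"] + 2.0 * data["est_equity_pct"]
          + 0.7 * data["recent_life_event"])
data["sold_within_12mo"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-signal))).astype(int)

features = ["years_owned", "est_equity_pct", "recent_life_event", "listings_nearby_12mo"]
candidates = {
    "logistic": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}

# Fit candidate models separately for each neighborhood and keep whichever one
# cross-validates best there: the "hyper-local" idea described in the article.
best_models = {}
for hood, rows in data.groupby("neighborhood"):
    X, y = rows[features], rows["sold_within_12mo"]
    scores = {name: cross_val_score(m, X, y, cv=5, scoring="roc_auc").mean()
              for name, m in candidates.items()}
    winner = max(scores, key=scores.get)
    best_models[hood] = clone(candidates[winner]).fit(X, y)
    print(f"{hood}: best candidate = {winner} (AUC {scores[winner]:.2f})")

# Rank current owners in one neighborhood by predicted likelihood to sell.
owners = data[data["neighborhood"] == "elm_block"][features]
likelihood = best_models["elm_block"].predict_proba(owners)[:, 1]
print("Top-decile outreach cutoff:", round(float(np.quantile(likelihood, 0.9)), 3))
```

The per-neighborhood loop is the point of the sketch: rather than one national model, the best-scoring candidate is kept for each block, which mirrors Gupta's "hundreds of predictive models."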

Personal Data Dossiers

Back in 1971 — when many MLS brokers carried printed 3×5 cards to show inventory — the legal scholar Arthur R. Miller wrote that “too many information handlers seem to measure a man by the number of bits of storage capacity his dossier will occupy.” Today such dossiers are far larger: vast electronic collections that catalog our preferences in excruciating detail. Not just a tidbit here and there, but encyclopedic volumes of data ceaselessly gathered through clicks, links, cookies, tracking pixels, surveys, cell phone locators, loyalty programs, credit card purchases, and other collection techniques. Companies, governments, and data brokers are accumulating unheard-of volumes of data. Forget about gigabytes, petabytes, and exabytes. We’ve hit zettabytes — a measure equal to one trillion gigabytes.

“By 2025 the global datasphere will grow to 163 zettabytes,” says IDC.
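For scale, the arithmetic behind “one trillion gigabytes” and the IDC projection, shown in decimal (SI) units:

```python
# One zettabyte expressed in gigabytes, using decimal (SI) units.
ZETTABYTE = 10**21   # bytes
GIGABYTE = 10**9     # bytes
print(ZETTABYTE // GIGABYTE)        # 1_000_000_000_000 -> one trillion gigabytes
print(163 * ZETTABYTE // GIGABYTE)  # IDC's projected 2025 datasphere, in gigabytes
```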
