Advice for New Real Estate Apps

3 minute read

Recently, I had the joy of rebuilding the property exports for a new property listing system. This involved the tedious task of creating XML feeds of property data for various real estate portals in Australia and around the world. With over 30 portals, some requiring their own separate and unique XML specification, the job caused quite a headache for our developers.

There were four recurring issues during this process, all of which should be avoided by any new real estate application or portal: creating your own unique XML specification, requiring third parties to sign confidentiality agreements, providing inadequate documentation and, lastly, failing to provide upload confirmation reports.

  • Unique XML Specification – A common mistake by new portals is thinking they need their own unique XML feed specification. Although REA do not officially permit it, it is common knowledge that REAXML is the industry standard, used and accepted by many existing portals (names withheld). Like any other developer, we can set up an REAXML feed to a new portal in around five minutes. If a portal has its own unique XML, however, it can take days to write the feed and days more to test it.
  • Confidentiality Agreements – I’m bewildered why I’m required to sign a confidentiality agreement when all we are doing is sending an XML file to a third-party website. I can understand signing a terms-and-conditions agreement for uploading, but anything else is just a speed bump on the way to getting an XML feed in place.
  • Inadequate Documentation – Portals that have their own XML specification need adequate, up-to-date documentation so developers can create the feed efficiently. Some portals have excellent documentation, while others have shocking documentation that is often out of date. Important: make sure your XML documentation is structured so it is easy to navigate, and keep it current.
  • Upload Confirmation Reports – This is probably the most important aspect of running an efficient export/import system. These reports are sent back to the bulk uploader after each XML file has been parsed by the third-party portal, highlighting which properties were added, updated or removed, along with which properties had errors and what those errors were. Once again, REA have an excellent reporting system that tells you the exact reason a property was not added. As a further benefit, they also send a daily email to the real estate agency summarising the properties parsed that day.
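To illustrate the last point, here is a minimal sketch of what consuming a confirmation report might look like on the bulk uploader's side. The report format below is entirely hypothetical (the real REA report structure differs); it only shows the kind of per-property feedback a portal should return.

```python
import xml.etree.ElementTree as ET

# Hypothetical confirmation report -- illustrative only, not the real REA format.
REPORT = """<uploadReport>
  <property uniqueID="ABC-101" status="added"/>
  <property uniqueID="ABC-102" status="updated"/>
  <property uniqueID="ABC-103" status="error">Missing listing price</property>
</uploadReport>"""

def summarise_report(xml_text):
    """Group property IDs by upload status, keeping any error messages."""
    summary = {"added": [], "updated": [], "removed": [], "error": []}
    for prop in ET.fromstring(xml_text).iter("property"):
        status = prop.get("status")
        if status == "error":
            summary["error"].append((prop.get("uniqueID"), (prop.text or "").strip()))
        elif status in summary:
            summary[status].append(prop.get("uniqueID"))
    return summary

print(summarise_report(REPORT))
```

With a report like this, the uploader can automatically flag failed listings to the agency instead of discovering them weeks later.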

In my opinion, the worst portals to export to are those controlled by the Real Estate Institutes. For some reason they believe having their own XML specification provides greater value to members. Most smaller portals accept the REAXML, but only a few provide adequate upload confirmation reports, which makes it hard to monitor the flow of properties.

For any new real estate application, it is important to reduce the barriers to receiving property listings. There is no reason to “reinvent the wheel” – simply follow the industry standard and use a system that is common practice.
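To give a feel for why a standard feed is so quick to support, here is a minimal sketch of generating a listings feed. The tag names below are simplified stand-ins, not the actual REAXML specification (which is controlled by REA); the point is that one generator function can serve every portal that accepts the standard.

```python
import xml.etree.ElementTree as ET

# Illustrative only -- simplified tags, not the real REAXML schema.
def build_feed(listings):
    """Serialise a list of listing dicts into a simple XML feed."""
    root = ET.Element("propertyList")
    for listing in listings:
        prop = ET.SubElement(root, "residential")
        ET.SubElement(prop, "uniqueID").text = listing["id"]
        ET.SubElement(prop, "price").text = str(listing["price"])
        ET.SubElement(prop, "description").text = listing["description"]
    return ET.tostring(root, encoding="unicode")

feed = build_feed([{"id": "ABC-101", "price": 450000, "description": "3 bed house"}])
print(feed)
```

If every portal accepted the same structure, this single function (plus per-portal delivery credentials) would be the whole export job.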



  • Charlie
    Posted March 12, 2010 at 2:19 am

    Ryan, 100% agree with every word.

    Feeding is the bane of my life, but so important as a service (agents want and deserve, understandably, ‘single data entry’). What gets me is when feeds are in place, have been working fine for years, then something weird happens – a feed file is ‘lost’, an email server at the receiver end fails, the internet connection drops out during a feed and the feed stops, the receiver changes something without telling you, etc. – if it can go wrong, it usually does! (‘Alamak’ as they say in Malaysia.)

  • Shane Dale
    Posted March 12, 2010 at 2:23 am

    Ryan – you got it easy mate! Try doing that in 2001 – everyone was just learning how to actually get property transfers to work, and only accepted CSV! REA were just creating their fledgling XML format. I was running websites loading to them. They were interesting times – welcome to the XML circus.

    It’s also just as much fun now as a portal (the other side of the coin) receiving what loaders describe as valid REAXML – myhome receives from over 50 different loaders (some with no property data experience) – so you can imagine what is received. But it works, mostly.

    Despite having common XML tags, the way they are used varies greatly – such as how ‘sold’ is sent (or not), how duplicate listings and multi-suburb listings are handled, what is allowed to be displayed, and the various pricing options such as Domain versus REA pricing and searchable field formats. They have different interpretations of how to use the data.

    I think your points are very valid. It was worse when some groups sent entire dumps of ALL listings every single day instead of just sending updates or changes. CSV formats were the worst – now defunct, I think.

    The other bogey is embedding images in the feed – a URL to the image is far superior, as it allows the receiving server to “grab” the image when processor time or bandwidth suits.

    Another good feature would be delete-and-resend of full agency listings – to get a “clean” listing feed whenever there has been a glitch or a server outage at either end, or any of the usual and unforeseen elements of running data systems. This resend should be an agreed process between parties, not an ad hoc one, I think. It’s a fact of life that no system is perfect, so we should all design with recovery plans for normal issues. This is what would allow the entire industry to offer better services to the agents, our clients.

  • Andy
    Posted March 12, 2010 at 2:38 am

    Thank you, Ryan.
    Very timely information for us here.
    We look forward to the weird things happening too – that’s life!

  • James
    Posted March 12, 2010 at 3:20 am


    I think you need to understand that not all systems are designed to do the same thing and the REA and Domain specs are designed to feed a portal and nothing else.

    You raise the systems of some of the institutes; they are different because they capture much more information and serve a very different purpose. The information contained in the REA XML specification, as an example, is only a small part of what these databases and systems require to populate a web portal and the many other systems or services that they provide.

    The last time I enquired, I was also of the understanding that the REA XML specification was copyrighted and limited to uploads to REA.

  • Nick
    Posted March 12, 2010 at 3:34 am

    Interesting timing as I’m about to start the process of rewriting our systems. 🙂
    I’ll be using fun things such as Gearman to make the process incredibly fast. It will be completely modular as well with input and output modules.

    One thing I don’t understand is why on earth most processes are done via FTP.
    Why not SFTP using SSH keys for verification, or even something that would allow instantaneous reaction times, like REST or POSTing the XML file?

    And why not use gzip? XML compresses very well. For large feeds it could make a difference.

    To beat the sync problems, I’m opting for a full history system, similar to a wiki, which allows me to request a list of changes from any arbitrary time. If a portal stops processing for a while, my system can tell when the last successful push was and send from that point in time.

    I’m also interested in exploring an auto-configuration system which would allow an agent to start pushing within minutes without mucking around. Think OpenID or OAuth.
    That would require cooperation from many people however to be effective.

    By the way Ryan, I haven’t seen much in the way of specs for ZooProperty’s XML format – just a very long list of XML tags.

  • Shane Dale
    Posted March 12, 2010 at 10:28 am

    Nick, some good ideas there – however as we all know REA xml is a sort of standard and allows things to work, however imperfectly. XML is at least robust and hard to stuff it up, being flexible in its tag structure.

    You answered your own point “That would require cooperation from many people however to be effective.” to make anything new. Hmmm, well not holding my breath for that to happen in this industry!

  • Ryan O'Grady
    Posted March 12, 2010 at 10:12 pm

    James, we’re stuck between a rock and a hard place here. It can be argued that we should be providing property seekers with as much information as possible. However, from a bulk uploader’s perspective it causes huge issues, not only in creating the XML but also in adjusting our system.

    If each portal had its own unique XML, the bulk uploader would need to adjust their property listing interface to account for all of the tags in each feed. When we’re exporting to 30+ portals, each with their own specification, the property listing template in our system would have 200+ fields. Many agents would look at that listing template and dismiss a large number of the fields, as they would only use the fields they normally fill in for REA or Domain. Those agents who do fill in all or most of the data fields would then ask why this data does not appear on REA or Domain.

    What most, if not all, property listing systems do is design their property listing templates to include the same fields which the major portals (Domain and REA) do. This often means that many tags found in a unique XML (like Realestateview) are omitted from the XML sent. We’re building a small portal at the moment and the property information displayed is the same as the information found in a Domain or REA XML, meaning the transfer of property data from agents to the portal is quick and simple.

  • Ryan O'Grady
    Posted March 12, 2010 at 10:30 pm


    We can’t even develop an industry standard XML, so there’s no chance of developing an alternative. It will start with the portals pushing out an alternative, and then the bulk uploaders will follow suit.

    The Zoo Property XML… we accept feeds from HubOnline and a few other bulk uploaders, so at this stage we parse the format they send us. The tags in question relate to the Zoo Property API.

  • Sam Hutton
    Posted August 12, 2010 at 9:45 pm

    For the portal that we are creating, we are planning to only support the REAXML specification. Is that the same as the REAXML?

  • website design gold coast
    Posted November 14, 2010 at 6:56 pm

    Guys, do you know of any REAXML import script?

  • Ryan O'Grady
    Posted November 15, 2010 at 6:40 am

    There are no off-the-shelf scripts on the market because the import script needs to sync with your database structure. You will also need to negotiate with all of the real estate software providers in Australia and convince them to set up a feed to you. This system already accepts feeds from all of these providers, so you can use it to import the properties and then display the listings on the frontend through plugins.

  • Bill Shields
    Posted January 30, 2011 at 11:18 am

    Hi Ryan,

    We have an XML processor (importer) that accepts the de facto standard REAXML. We built it for ourselves, but to a commercial grade, and we have sold it as a complete XML solution for people who want all the XML and database work looked after for them.

    It comes with SQL structures, image processing libraries, and a complete management system with reply emails (like REA). Adding new uploaders is a two-minute job.

    Other than that, I agree wholeheartedly with your comments, especially regarding new portals that think they should have their own, often poorly considered, XML standard. It just makes work for us, especially when it doesn’t work properly.

  • Jay
    Posted June 25, 2014 at 7:17 pm

    I can’t believe how much some places charge for setting up an REAXML to a new portal in around 5 minutes. Don’t they want their data to be in more places?
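A couple of the ideas raised in the comments above (gzip-compressing the feed, and pushing it over HTTP instead of FTP) can be sketched briefly. XML is highly repetitive, so it compresses very well; the endpoint URL in the commented-out POST example is hypothetical.

```python
import gzip

# A repetitive feed stands in for a large real-world XML export.
feed = "<propertyList>" + "<residential/>" * 1000 + "</propertyList>"
compressed = gzip.compress(feed.encode("utf-8"))
print(len(feed.encode("utf-8")), "->", len(compressed), "bytes")

# Delivery could then be a single HTTP POST rather than FTP, e.g.:
# import urllib.request
# req = urllib.request.Request(
#     "https://portal.example/upload",  # hypothetical endpoint
#     data=compressed,
#     headers={"Content-Encoding": "gzip", "Content-Type": "application/xml"},
# )
```

For large feeds the bandwidth saving is substantial, and an HTTP push gives the uploader an immediate response instead of waiting for an FTP batch to be picked up.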


NetPoint Group