Developing Web Standards for Property Identification

Over the past few weeks I have had a number of phone calls from irate agents asking, 'Did you send our listings to MyHome?' My answer was an emphatic NO. The biggest problem they face is old listings, and this comes down to a poor understanding of how agents' websites work. Many agents still have listings 'live' long after the properties are sold, even if you cannot view them from the main page. This is mainly because of the way sites are set up, and there is absolutely nothing wrong with it.

I will be adding MyHome soon, as I do think it has great potential that will not be realised for some time. You will see the free offer extended for at least another six months (I suggest at least a year), because once payment time comes agents will desert the site if it is not giving them any return. I doubt many agents will fund a site for the long term; it is up to MyHome to provide this return, and I am sure agents would be only too happy to pay a reasonable fee once it does.

Update your Terms
So this leads me to scraping. It is wrong – though not, to an extent, illegal – but agents have to update their Terms of Use and make it clear that written permission must be obtained before any data on their site is used on any other site. Then join a group of agents and take legal action against any site that does not comply.

Get your data in XML
Creating an XML data specification for the whole industry is important, and REA cannot do this alone; they may want to, but they will only cover data they accept. For this to work, and for a person like me to even begin to accept it, they have to get together with all of the major portals and other players in the industry and fund the development of an industry-accepted data specification for identifying property information. This specification has to be 100% open, so that when Joe Blow Real Estate wants their website developed by cousin Michael or Michaela, he or she can get access to the specification and develop away. I would never accept a specification that was closed, and REA and Domain have to show some good faith here. I can understand commercially why they would close it off, but morally it would be wrong to restrict it to only the big boys.
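To make the idea concrete, here is a rough sketch of what a single listing might look like under such a specification, generated with a few lines of Python. Every element name and value below (listing, address, features, the id format) is hypothetical – no agreed standard exists yet, which is exactly the point:

```python
# Sketch of a single property listing under a hypothetical open XML spec.
# All element names and values here are invented for illustration.
import xml.etree.ElementTree as ET

listing = ET.Element("listing", id="AU-NSW-2007-000123", status="current")

address = ET.SubElement(listing, "address")
ET.SubElement(address, "street").text = "1 Example Street"
ET.SubElement(address, "suburb").text = "Mosman"
ET.SubElement(address, "state").text = "NSW"
ET.SubElement(address, "postcode").text = "2088"

features = ET.SubElement(listing, "features")
ET.SubElement(features, "bedrooms").text = "3"
ET.SubElement(features, "bathrooms").text = "2"

ET.SubElement(listing, "price", display="yes").text = "850000"

xml_string = ET.tostring(listing, encoding="unicode")
print(xml_string)
```

Because the result is plain XML, cousin Michael or Michaela could produce and parse it with any off-the-shelf XML library – no proprietary toolkit required.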


Access to this Data

What should be done is to develop a way we can all protect this data from scrapers. Just because it is wrong does not mean people will not do it. The first thing I noticed about the REA Council was that they were going to explore this avenue, and I absolutely applaud the endeavour. But open it up: closing it off would be going down the same road as the music industry. Make sure it is open. I will be the first to volunteer my time to make sure this is done correctly, and if it means flying interstate at my own cost I will do it, because I think it is an important step forward.

Updating
I also think this should be a specification developed for the world, and we could be a leader in it. We should have quarterly reviews where it is updated to accommodate any new technologies that become available. We have specifications for site maps and web standards, so why not real estate property listings?

Your thoughts…


About Peter J Ricci

Peter Ricci is the Director of Agentpoint.com.au, Business2.com.au, Ginga.com.au and ZooProperty.com and has been involved in designing and developing real estate systems and websites since 1997. In July 2001 Peter founded Business2.com.au to help real estate agents better understand the power of the Internet and the real estate landscape in Australia and New Zealand. Since then he has penned over 300 articles on a variety of subjects in the real estate technology industry. Business2.com.au is now the leading real estate technology site in Australasia.

21 Responses to Developing Web Standards for Property Identification

  1. Glen Barnes March 27, 2007 at 9:33 am #

    Have you checked out RETS? I'm no expert on the matter, but it looks as though this might be going down that track. Implementing the addresses as Microformats would also be a good idea.

  2. John Dedes (Land Agent) March 27, 2007 at 1:11 pm #

    Full address and price listed on every property please, agents, for starters! Before it is even posted on the web, say on REA!

  3. Anthony March 27, 2007 at 2:30 pm #

    Ummm – you may want to assess the robots.txt file of your website.

    This basically defines what you allow robots, spiders and other crawlers to do on your site. By stipulating that written permission is required, you are saying Google, Yahoo and the rest cannot spider your site. Ahhh, but you want them to look at your site, just not ABC Backdoor Portal. Well, put that in your robots.txt file.

    http://www.domain.com.au/robots.txt
    http://www.realestate.com.au/robots.txt

    If you compare the two robots files above, Domain is very specific whilst REA effectively does not disallow any spiders at all.

    http://www.robotstxt.org/wc/norobots.html
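A small sketch of how those rules play out, using Python's standard robots.txt parser ("BackdoorPortalBot" is a made-up user agent standing in for the hypothetical scraper):

```python
# Allow mainstream search engines but block one named scraper.
# urllib.robotparser applies the same rules a well-behaved crawler would.
import urllib.robotparser

rules = """
User-agent: BackdoorPortalBot
Disallow: /

User-agent: *
Disallow: /admin/
""".splitlines()

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("Googlebot", "/listings/123"))          # allowed
print(parser.can_fetch("BackdoorPortalBot", "/listings/123"))  # blocked
```

Of course, robots.txt is a polite request, not an enforcement mechanism – a misbehaving scraper can simply ignore it.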

    As for scraping – as I understand it, it is not illegal – but I am now well aware that it isn't considered good etiquette.

    But who owns the copyright to a property's details? The owner, the portal or the agent?

    I believe the answer changes based on who is doing the copying. If I scrape REA and use the data, they will claim copyright; if another agent copies an agent's listing off REA, they will claim it is the agent's copyright and not take legal action. (I actually have the emails and correspondence on this issue.)

    As for taking legal action – dangerous turf, Peter. The advice I have received, ranging from professors at ANU to solicitors and barristers (effective as of July last year), is that they don't know which way it would go; it depends on how the data is framed. If you claim ownership of the data then you may lose, but if you say it is not your data and refer back to the originating URL then you can be seen to be imitating a search engine. There are no legal precedents in Australia that I am aware of to date (last July).

  4. Glenn March 27, 2007 at 2:37 pm #

    Peter,

    In the rant room thread I posted a link to the Center for Realtor Technology run by the NAR. If you have not checked it out already, I think you should. You will find that they already have an XML standard for the web that is used right throughout the US. Clearly, with their endless array of MLS systems swapping data with agencies, they needed an official standard in place.

    I don't know if it could be applied over here, but such a specification could at least be a great starting point for Australia. It has to be put in place by the REIA and endorsed by the state institutes. Get REA and Domain and anyone else to make submissions by all means, but somehow the industry has to insist that they follow the specification that is best for the industry, not for their commercial requirements.

    I think this specification has to allow for a common property id, so we as agents have only one id to work with across all portals. They can cross-reference their own codes by all means, but the primary id is the one everything is synced by and the one given in property enquiries and the like.
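A minimal sketch of that cross-referencing idea (every id below is invented):

```python
# One primary industry id per property; each portal keeps its own internal
# code but maps it back to the primary id. All ids here are hypothetical.
cross_ref = {
    "REA-9981724": "AU-2007-000123",   # a portal's internal code
    "DOM-4471223": "AU-2007-000123",   # another portal's internal code
}

# Any enquiry quoting a portal code resolves to the single industry id:
print(cross_ref["DOM-4471223"])
```

Whichever portal the enquiry comes from, the agent works with one id.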

  5. Anthony March 27, 2007 at 2:46 pm #

    Love the ID suggestion

  6. Glenn March 27, 2007 at 3:53 pm #

    I don't think there is any real question on copyright, is there?

    Facts like how many bedrooms a property has or how big the land is are not subject to copyright. Imagine if that were not the case… I would copyright the length of the Harbour Bridge and the dimensions of Ayers Rock, so whoever printed those figures would have breached my copyright, and you can bet I would sue for damages 🙂

    Photos are copyright to the person who took the photograph unless they have waived or sold their rights to the photo. Creative text, as in headings and descriptions, is copyright again to the writer.

    I don't think the comparison to a search engine is fair either. Search engines don't display all, or even the majority, of the content of somebody else's page. I don't see how providing a link back to the source information would absolve you from the provisions of the Act. If you reproduce somebody else's copyright material other than for review purposes then you're in trouble.

    I guess that means that if the agent gives authority to one portal, it has the right to scrape the agent's copyright material off the other. The only exception may be the terms and conditions of the site… but that is less likely to stand, as the portal is making the agent's copyright information publicly available. Sort of like putting a sign up on the road but writing at the bottom of the sign that only people over 18 are allowed to read it.

  7. Robert Simeon March 27, 2007 at 4:08 pm #

    Personally, I see the copyright issues as a "storm in a teacup". If one feels that strongly then they should watermark each photograph, and we all know how user-friendly that would be. It is a simple matter of getting one's business compliant with Google (which we are). I just looked at our Ranking Report Analysis for last month, and Google Australia delivered 32 per cent of our total traffic in February.

  8. Glenn March 27, 2007 at 6:17 pm #

    We are around the same… it ranges from 30 to 40% in any given month. What is interesting, though, is that I have goals set up in Google Analytics for viewing of property for sale (and other goals for rentals etc.), so I can look at the quality of the links coming from different sources, geographic locations and even search engine keywords. Google's referral quality and pageviews per visit are always very high in comparison to other search engines and directories.

    If you are using Google Analytics and have not installed goals on your site yet, what are you waiting for? This is the most powerful feature in the whole program when set up correctly. It allows you to look at the quality of all of your other statistics.

    Example 1:
    Referrer 1 sent you 150 visitors last month; 20% of them viewed property for sale and 10% viewed rentals.

    Compare this with Referrer 2, who only sent you 100 visitors, but 70% viewed property for sale and 20% viewed rentals.

    The goals factor shows you that Referrer 2 is clearly a better referrer for you than Referrer 1. Which site would you pay for banner ads on?
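The arithmetic behind Example 1, as a quick Python sketch (the figures are the hypothetical ones from the example):

```python
# Goal-quality comparison: raw visitor counts flatter Referrer 1, but the
# goal conversion rates show Referrer 2 delivers far more property viewers.
# All figures are the hypothetical ones from Example 1.
referrers = {
    "Referrer 1": {"visitors": 150, "sales_rate": 0.20, "rentals_rate": 0.10},
    "Referrer 2": {"visitors": 100, "sales_rate": 0.70, "rentals_rate": 0.20},
}

results = {}
for name, r in referrers.items():
    results[name] = {
        "sales_views": r["visitors"] * r["sales_rate"],      # for-sale viewers
        "rental_views": r["visitors"] * r["rentals_rate"],   # rental viewers
    }
    print(name, results[name])
```

Despite sending 50% more visitors, Referrer 1 produces fewer than half the for-sale viewers of Referrer 2 – exactly the kind of thing raw traffic numbers hide.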

    Example 2:
    Your SEO provider charges you $800 for optimising certain pages on your site with keywords they select. Traffic starts arriving a couple of months later, and at first glance the traffic appears good, BUT the goals in your Google Analytics account show that under 5% of that traffic actually views property for sale. That was a huge waste of $800.

    Applying quality analysis to statistics makes all the difference. This is why I asked Dave for further stats beyond what he was boasting, so we could see how good they really were. To date he still has not provided any of the requested information, despite initially indicating he was able to do so. Maybe the further information showed that the statistics he quoted did not stand up well to analysis and scrutiny.

  9. Robert Simeon March 27, 2007 at 7:34 pm #

    Thanks Glenn,

    You add great value for real estate agents who closely read your most informative commentaries. I am in fast pursuit of Google Analytics, as it is obviously an awesome online tool. How quickly the online landscape can change with the breaking news of Google Maps and Google Base.

    One can only ponder whether the Google real estate portal will be launched here in Australia before the United States. It makes plenty of sense.

    Quite amazing that Google also delivers 30–40 per cent of your business traffic each month. For free, too 😉

  10. Anthony March 27, 2007 at 9:26 pm #

    Australia is regularly used by Google as a test site – we represent only $200 million or so of their total $12 billion in revenue, yet we display many similarities to their larger markets.

    I reckon they may just test the waters here first – assess exactly the impact on their traditional AdWords customers and whether they offset that revenue with new business.

  11. Robert Simeon March 27, 2007 at 10:00 pm #

    Anthony,

    My mail is that testing here in Australia is a fait accompli. It is not just the bloggers here but the powers that be who share the view that Australia would be the perfect test case, for the obvious reasons you state.

    Oh – and who said it "can only happen in America"? They should launch in Australia – and my understanding is that they will – as our property markets deliver a great platform.

    Watch Fairfax counter with their multiple domains in print and online. Already the odd couple of REA and Cumberland have formed a new alliance, with REA mastheading the Cumberland Newspapers real estate sections.

    When Google finally lay their cards on the table they will immediately assume the top position in online property. Just as interesting will be how the Aussies counter-attack the most formidable online player on the planet.

    As I alluded to some weeks ago, we won't be renewing contracts, as the Google launch has the potential to completely change the Australian online landscape. What surprises me most is that Google appear to be fast-tracking their entry. The recent releases have many spellbound – and all with minimal attention.

    Google are past masters of the stealth online release. One could only assume, with the release of Google Base, that they are telling Fairfax Digital and REA that they mean business, with an insatiable appetite.

  12. Glenn March 27, 2007 at 11:18 pm #

    Robert..

    Thanks for your comments.

    You will love Analytics. Like anything, spend a bit of time setting it up properly and it will reward you with a wealth of information to tweak your internet presence for even better results.

    Make sure you set up your goals and your campaigns. I discussed goals earlier, but campaigns allow you to tag links back to your website as individual campaigns. This lets you analyse specific campaigns, and collective campaigns using the same medium.

    This page http://www.google.com/support/urchin45/bin/answer.py?answer=28690 allows you to generate custom URL links back to your website to utilise the Google Analytics campaign feature. Have a read and you should get a better picture of what is going on. Urchin was the name of the product before Google bought it out. Much like Google Earth, it was a great product available for sale but with little market share until Google bought the company, rolled out the base product for free and charged for the high-end or premium versions.

    You will have links back to your website in your email signature, email newsletter, email property alerts, email property management owners' newsletter, REA top banner, REA left side banner, REA Agent Directory, Domain, franchise head office directory and so on.

    Each of these links is unique, and this will allow you to get statistics on the medium (such as email), the source (such as REA) or the individual campaign (such as traffic generated from the link in your salesman's email stationery). You can even tag different content for the same campaign, such as an animated "click here" graphic on that email stationery versus the normal website button.

    With this you could work out the quality of different banner ads, tweak different email newsletters until you get your highest-quality responses, and so on.
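As a sketch of what those tagged links look like under the hood, here is a tiny Python helper using the standard Google Analytics utm_* campaign parameters (the domain, campaign and content names are invented for the example):

```python
# Build a campaign-tagged link of the kind the Google URL builder generates.
# utm_source / utm_medium / utm_campaign / utm_content are the standard
# Google Analytics campaign parameters; everything else here is made up.
from urllib.parse import urlencode

def tag_link(base_url, source, medium, campaign, content=None):
    """Append Google Analytics campaign parameters to a link."""
    params = {"utm_source": source, "utm_medium": medium, "utm_campaign": campaign}
    if content:
        # utm_content distinguishes variants within one campaign,
        # e.g. the animated "click here" button versus the plain one.
        params["utm_content"] = content
    return base_url + "?" + urlencode(params)

link = tag_link("http://www.exampleagency.com.au/",
                source="email", medium="signature",
                campaign="march-newsletter", content="click-here-button")
print(link)
```

Each unique combination of parameters shows up as its own campaign line in the Analytics reports, which is what makes the per-medium and per-banner comparisons possible.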

    Most people who get Analytics don't set up these features and just use the program for basic web stats, but these are what give GA its power. The other advanced feature of note is cross segment performance, which allows you to look at statistics for a subset of the data.

    As an example, this is what I used to produce my broadband percentages for different countries, posted in another thread. The standard stats might say 12% of visitors are from NZ and 75% of all visitors use broadband, but cross segment performance allows you to see that of the NZ visitors (12% of the total) only 35% connected at broadband speed (figures made up for the example).

    You could of course use CSP for anything, including campaigns. So you could see what keywords people from Canada used when referred by a search engine over the past 12 days, and know what percentage of those visitors actually viewed for-sale property, rentals or, say, looked at your company brochure… Sound powerful enough for you? Add to that stats that are only around 2–3 hours behind actual time.
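Cross segment performance in miniature: the sketch below uses an invented visit log that roughly mirrors the NZ example, computing the overall broadband share and then the share within the NZ segment alone.

```python
# Invented visit log: 100 visits, 20 from NZ (7 on broadband),
# 80 from AU (60 on broadband). All figures are made up for the example.
visits = (
    [{"country": "NZ", "broadband": True}] * 7
    + [{"country": "NZ", "broadband": False}] * 13
    + [{"country": "AU", "broadband": True}] * 60
    + [{"country": "AU", "broadband": False}] * 20
)

def share(records, predicate):
    """Fraction of records matching the predicate."""
    return len([r for r in records if predicate(r)]) / len(records)

overall_bb = share(visits, lambda r: r["broadband"])   # across all visitors
nz = [r for r in visits if r["country"] == "NZ"]       # the segment
nz_bb = share(nz, lambda r: r["broadband"])            # within the segment

print(f"overall broadband: {overall_bb:.0%}, NZ broadband: {nz_bb:.0%}")
```

Here the overall broadband share is 67% while the NZ segment sits at only 35% – the headline figure would never tell you that.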

    Unlike campaigns and goals, you don't need to set anything up for CSP. You just need to understand what it is saying and how to interpret that information.

    BEST OF ALL IT IS FREE!!!!

    Let me know what you think when you get it up and running..

  13. Peter Ricci March 27, 2007 at 11:25 pm #

    Glenn, I have been using Analytics for about 3 years now and it is amazing. It is one of those products that gives power to someone who is interested enough, like yourself, to explore it. There is a pretty good book you can buy from Amazon.

    Google Analytics
    by Mary E. Tyler (Author), Jerri Ledford

  14. Peter Ricci March 27, 2007 at 11:27 pm #

    Glen Barnes, yes, Microformats are a great start for anyone, and thanks for the tips. My, it is a big document.

  15. Peter Ricci March 27, 2007 at 11:28 pm #

    John (Land)

    I think you want addresses displayed… I think all buyers do!

  16. Peter Ricci March 27, 2007 at 11:29 pm #

    The reason I think Google would see Australia as a good testing ground is that the US is a bit of a funny market; Australia would be perfect. I am having a meeting with some people on Friday which should be interesting… I cannot say much, but they do read this site!

  17. Peter Ricci March 27, 2007 at 11:33 pm #

    Glenn, yes, natural copyright belongs to the creator of the content, be it photos, movies, etc., unless otherwise agreed. Many agents get themselves into trouble with website design and professional photography because they only keep the finished product; I have written about this too. I wrote an article about Google Analytics a year ago, and I would have thought most agents would have it by now.

  18. Glenn March 27, 2007 at 11:41 pm #

    Peter,

    So you were using the Urchin product before Google rolled out GA in November 2005…

    Tell me, did Google add much to that release, or was it just a rebrand of the same version Urchin had? You are the only person I have met (well, online at least) who used the Urchin product. A great product, but they did not have the clout and vision of a company like Google, so they got swallowed up.

  19. Glenn March 27, 2007 at 11:44 pm #

    In my experience GA has about 10% saturation in the industry, and I have only met one person who had it set up properly. The majority don't use anything!

  20. Peter March 27, 2007 at 11:59 pm #

    Yes I know. Agents (and big ones) have come to me having never checked their statistics software. The problem is that a lot of web developers do not care, and provide software which is ridiculously hard for any user to understand.

    One potential customer quoted on TV that his site was getting 100 thousand visitors a month. I called him after watching his ad and explained to him how things actually work. Thankfully no-one uses the word 'hits' much these days, well, except for some so-called journalists.

    My hosting company had Urchin; it has not really changed that much except for the integration with Google (AdWords), and it is much quicker now.

  21. Anthony March 28, 2007 at 12:41 pm #

    GA also allows cross-channel reporting.

    I reckon it's great viewing your Yahoo activity through the GA tool.

    I have been a convert since late last year – but still have so much to learn 🙂
