Blog

Killing Money without Renting

posted Jun 10, 2017, 10:30 PM by Jack Marrows   [ updated Jun 10, 2017, 11:45 PM ]

"Buy your home, rent money is dead money", a piece of advice we have all heard but perhaps haven't taken the time to interrogate. Having just recently purchased a home myself I thought I would take the time to draw my own conclusions. The home I purchased was 10km out of Brisbane and as such, my findings are specific to this area.

Dead Money

When calculating the dead money involved in renting or buying, the key point is that the rate of dead money changes over time. The economic climate and your personal financial situation therefore play a large part in deciding whether you are better off buying. When we look at dead money at a single point in time we must consider:
  1. Home loan interest depends on the interest rate and the amount of money borrowed from the bank -
    1. If interest rates go up, so does the dead money spent
    2. If the amount owing is reduced (as it is over the course of a loan), so does the dead money spent
  2. Inflation impacts all expenses other than the amount spent on home loan interest
  3. Opportunity cost of the capital tied up in the property - for example, if you have $180,000 of capital tied up at the outset, this amount will increase over the course of the loan
Weekly rent in Brisbane covers the ownership charges (such as rates, water and insurance) that the landlord pays. Below is a comparison of the dead money spent purchasing a place compared to renting.

Renting - $480/wk

Buying - $539/wk

 Expense              Amount (/wk)
 Rent                 $480


 Expense              Amount (/wk)
 Rates                $107
 Water                $30
 Maintenance          $19
 Home Insurance       $12
 Home Loan Interest   $337
 Opportunity Cost     $34
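The home loan interest row dominates, and it is easy to reproduce: it is roughly the loan balance multiplied by the annual rate, spread over 52 weeks. A quick sketch in Python (the $469,000 loan and 3.74% rate are taken from the Scenario 1 parameters below; the opportunity cost row depends on the deposit and an assumed savings rate, so it is simply summed as-is):

    # Quick check of the buying table (loan and rate from Scenario 1 below).
    loan, rate = 469_000, 0.0374
    weekly_interest = loan * rate / 52
    print(round(weekly_interest))         # -> 337, the Home Loan Interest row

    # Total weekly dead money for buying, summing the table rows.
    print(107 + 30 + 19 + 12 + 337 + 34)  # -> 539, the $539/wk headline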


At the outset, someone in the circumstances above would be spending more dead money buying than renting. Let's now start modelling to answer the question of when this will change.

Scenario 1 - everything is static 

Change is the only thing we can be sure of; this is therefore a scenario we can be certain will not occur.

Scenario parameters -
 Interest Rate                3.74%
 Inflation                    2%
 Additional Loan Repayments   $0
 Loan amount                  $469,000

In this scenario, we see rent increase with inflation and interest decrease as the loan is paid off.
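This kind of model is straightforward to build. Below is a minimal sketch, assuming weekly steps, a standard amortising repayment derived from the loan terms, and the Scenario 1 parameters; it is illustrative rather than the exact spreadsheet behind the tables, and it leaves opportunity cost out (adding it pushes the crossover later, as noted below):

    # Minimal dead-money model for Scenario 1 (illustrative assumptions).
    WEEKS = 52
    loan, rate, inflation = 469_000, 0.0374, 0.02
    rent = 480.0                      # $/wk, indexed to inflation
    other_costs = 107 + 30 + 19 + 12  # rates, water, maintenance, insurance ($/wk)

    # Fixed weekly repayment for a 30-year amortising loan.
    r = rate / WEEKS
    n = 30 * WEEKS
    repayment = loan * r / (1 - (1 + r) ** -n)

    balance, crossover = loan, None
    for week in range(n):
        interest = balance * r            # the dead-money part of the repayment
        balance -= repayment - interest   # the remainder reduces the principal
        if crossover is None and interest + other_costs < rent:
            crossover = week / WEEKS      # years until buying beats renting
        if week % WEEKS == WEEKS - 1:     # index rent and running costs yearly
            rent *= 1 + inflation
            other_costs *= 1 + inflation

    print(f"buying beats renting after ~{crossover:.1f} years")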


 Total dead money over 30 years
 Renting                          $1,057,790
 Renting (less opportunity cost)  $941,135
 Buying                           $689,100


The point of intersection for dead money is ~2 years into the loan, when rent and buy dead money are both ~$500/wk. At this point the amount owing on the home loan would be $451,000.

When opportunity cost is considered, this point is pushed out to ~5 years, when rent and buy dead money are both ~$487/wk. At this point the amount owing on the home loan would be $421,000.


Scenario 2 - economists' predictions occur, additional repayments made

Economists are predicting that interest rates will increase by about 2 percentage points, and people taking out home loans are being asked to budget accordingly. In this model we assume the increase happens within the next two years and that the average rate over the remainder of the loan then stays the same.

For this scenario, we also assume additional repayments are made on the home loan; cautious borrowers plan to do this.

Scenario parameters -
 Interest Rate                Yr 1 - 3.74%
                              Yr 2 - 4.74%
                              Yr 3 - 5.74%
 Inflation                    2%
 Additional Loan Repayments   $1,000/mth
                              $10,000 once off
 Loan amount                  $469,000

In this scenario, we see rent increase with inflation and interest decrease as the loan is paid off. However, more interest is charged, and the opportunity cost is assumed to be higher as well because average interest rates are higher - the cash rate has gone up.
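The sketch from Scenario 1 extends naturally here; the version below steps the rate up a point at the start of years two and three and adds the extra repayments. The minimum repayment figure is an assumption for illustration, not a number from the original model:

    # Scenario 2 variant of the sketch above (illustrative assumptions).
    WEEKS = 52
    balance = 469_000 - 10_000   # $10,000 once-off repayment up front
    extra = 1_000 * 12 / 52      # $1,000/mth spread across the weeks
    repayment = 500              # assumed minimum weekly repayment ($/wk)

    for week in range(30 * WEEKS):
        year = week // WEEKS
        rate = 0.0374 + 0.01 * min(year, 2)  # 3.74% -> 4.74% -> 5.74%
        interest = balance * rate / WEEKS    # dead money shrinks with the balance
        balance -= repayment + extra - interest
        if balance <= 0:
            print(f"loan repaid after ~{week / WEEKS:.0f} years")
            break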


 Total dead money over 30 years
 Renting                          $1,057,790
 Renting (less opportunity cost)  $623,607
 Buying                           $672,348


The point of intersection for dead money is ~8 years into the loan, when rent and buy dead money are both ~$558/wk. At this point the amount owing on the home loan would be $321,670.

When opportunity cost is considered, this point is pushed out to ~17 years, when rent and buy dead money are both ~$350/wk (note: rent is actually $672 per week, however you have $553,972 invested, which helps to offset the cost). At this point the amount owing on the home loan would be $95,000.


Scenario 3 - an historical scenario - bought a home in 1959

In 1959 the standard variable home loan rate was 5%; it is about 5% right now too. From that year through to 1989, interest rates grew until they reached an annual average high of 16.45%. In this scenario, I have modelled what would happen if this were to occur again.


What is also obvious is that inflation over this period was very high. As such, this is the only scenario where I have also adjusted inflation over the period. High inflation pushes rents up quickly, so rents, and even the return on the assumed cash investment, end up working against renters.



Scenario parameters -
 Interest Rate                Increases as per chart above
 Inflation                    Increases as per chart above
 Additional Loan Repayments   $1,000/mth
                              $10,000 once off
 Loan amount                  $469,000

In this scenario, we see rent increase with inflation and interest decrease as the loan is paid off. However, more interest is charged and it is assumed that there is a greater return on investment too - the cash rate has gone up.


 Total dead money over 30 years
 Renting                          $2,167,337
 Renting (less opportunity cost)  $2,849,291
 Buying                           $962,845


The point of intersection for dead money is ~1 year into the loan, when rent and buy dead money are both ~$495/wk. At this point the amount owing on the home loan would be $438,140.

When opportunity cost is considered, this point is pushed out to ~5 years, when rent and buy dead money are both ~$477/wk. At this point the amount owing on the home loan would be $349,829.


Conclusion

Minimising Dead Money

As demonstrated by this article, the amount of dead money you spend on your home (buying or renting) can be modelled easily. The lever that can be moved to minimise dead money is the size of your deposit relative to the cost of the property.

Consider that the points of intersection for me across the scenarios (including opportunity cost) were -

 #  Scenario                                                    Years  Home Loan Amount
 1  Everything is static.                                       5      $421,000
 2  Economists' predictions occur, additional repayments made.  17     $95,000
 3  An historical scenario - bought a home in 1959.             5      $349,829


In the above models, the purchaser should have saved a deposit between $48,000 and $374,000 larger to minimise the dead money spent - or bought a place that was worth less.
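The range follows directly from the crossover balances above: the extra deposit needed is the original loan amount minus the balance owing at the point of intersection.

    # Where the deposit range comes from (figures from the table above).
    print(469_000 - 421_000)  # -> 48000,  Scenario 1
    print(469_000 - 95_000)   # -> 374000, Scenario 2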

Should I Buy a House?

The answer is not black and white; it depends heavily on your circumstances. If you are buying a house and have the funds to minimise dead money, then anecdotally it looks like a sound decision.

In the models we explored in this article, even without minimising the dead money spend, 2 of the 3 scenarios modelled indicated the buyer would be financially better off than the renter. This is before we consider -
  1. Capital growth on the property (you may get this from other investments too)
  2. The fact that the dead money spend was not minimised in any scenario
Below is a summary of the dead money spent by scenario -
 #  Scenario                                                    Buying    Renting     Renting (less opportunity cost)
 1  Everything is static.                                       $689,100  $1,057,790  $941,135
 2  Economists' predictions occur, additional repayments made.  $672,348  $1,057,790  $623,607
 3  An historical scenario - bought a home in 1959.             $962,845  $2,167,337  $2,849,291

Massive Omissions and Assumptions

In any model there are a lot of assumptions and omissions; in this case, off the top of my head:
  1. The model does not take into account capital growth - when you sell a house it may be worth more or less, and for a lot of people it is worth more.
  2. It calculates opportunity cost as though the capital were invested in a term deposit or NetBank Saver; more aggressive investments could make more money.
  3. It assumes that rent and home ownership dead money will stay consistent with the inflation rate.

Interface Middleware - Who needs it? 6 Thoughts for Interface Teams

posted Jul 30, 2015, 3:13 PM by Jack Marrows

I write this blog as interface lead on a large Customer Information System project. During the project my team designed and built over 100 interfaces between applications internal to the solution and 14 third party vendors.


Message queues and service buses are commonplace in enterprise technology architectures, but they do not spell integration success; in many cases core principles are what make interfaces work. I will share my lessons learned from my most recent project and key considerations when architecting a solution.


  1. Understand Out of the Box Integration First - products from the same vendor may be built to integrate out of the box. Understand how this works prior to making any decisions around required tools or service buses, and especially prior to installation. The team installing the software will follow the user guide, which will detail default configurations and may not make full use of additional products purchased for the project.


  2. Ask: do we really need middleware (SB or MQ)? - there are good reasons to purchase a middleware product, such as enterprise SOA, message throttling, consolidated logging and monitoring, and guaranteed delivery. However, in my experience enterprise products offer these features without middleware, and most interfaces are file based. Prior to deciding on a tool, consider whether a combination of the core system and a batch scheduling tool can meet the requirement. A majority of the interfaces on my project were file based and used the batch tool for file transfers.


  3. Architecture first, implementation second - interfaces almost define themselves after guiding principles and common patterns are defined.

    • Define a pattern (aligned to your infrastructure) for file based interfaces, real time interfaces and queue based interfaces - we had minor variations on these and ended up with ~15 patterns.

    • Define archiving requirements and how they will be achieved within the infrastructure. For example, how long must payment files be stored after receipt from a vendor?

    • Define logging and alerting requirements and how they will be achieved within the infrastructure. For example, what should be logged and when people should be notified.


  4. Keep business rules out of integration - integration should be limited to transporting and routing messages. Unless a business rule is required to route a message, all other logic should sit within the core applications. For example, payment eligibility rules should live in Customer Care and Billing, not OSB. The core system owns the data and the respective rules. (There are exceptions to this rule.)



  5. Prototype prior to full execution - this one goes without saying: prototype and test the build of each pattern prior to full implementation across all interfaces. Thorough testing during the prototype phase will greatly reduce defects in later phases.


  6. Most importantly, it all comes down to schedules, directories, certificates, file names, message formats and credentials. Agree these six details with your interfacing partners and the build team can make it work. A sketch of what that agreement might look like is below.
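For illustration only, here is one way to capture those six details per interface; the names and values are hypothetical, not from the project:

    # Hypothetical interface agreement, one record per interface partner.
    interface_spec = {
        "schedule":       "daily 02:00",                   # when the transfer runs
        "directory":      "/in/payments/vendor_x/",        # agreed drop/pickup location
        "certificate":    "vendor_x_sftp_host_key.pub",    # secures the transport
        "file_name":      "PAY_{YYYYMMDD}.csv",            # agreed naming pattern
        "message_format": "pipe-delimited CSV, UTF-8",     # structure of the payload
        "credentials":    "service account svc_vendor_x",  # who authenticates
    }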


This was of course only my experience; I would be interested in the experiences of people on other projects.

Virtual government – could your government operate remotely?

posted Mar 23, 2015, 4:47 AM by Jack Marrows   [ updated Mar 23, 2015, 4:53 AM ]

I write this post after reading The New Digital Age: Reshaping the Future of People, Nations and Business by Jared Cohen and Eric Schmidt, and The Accenture 2014 Technology Vision.

Globally, virtual and physical worlds are merging, a trend that does not show any signs of slowing. More and more social, business and consumer transactions are occurring in the virtual space creating new efficiencies and vulnerabilities for organisations.  As this shift takes place I believe that enabling government services to operate mostly in the virtual world can deliver a new level of security that was historically not possible.  Virtual government can ensure stability in the growing virtual world even when there is severe uncertainty in the physical world.

This concept can be explored at a high level in the context of two very real scenarios: government stability during civil unrest and government operations after natural disasters. Traditionally in the physical world, after a civil uprising or natural disaster, key government services have been lost and in some cases important records destroyed.

A few examples of real consequences following such events include:

  • Healthcare (even advice) can become unattainable if hospitals and doctor surgeries are destroyed or unreachable
  • Education is put on hold while physical school environments are unreachable
  • Corruption can emerge in security forces

If a government can continue to function virtually from more stable physical locations in light of such events, the above impacts can be mitigated to an extent:

  • eHealth can ensure that patient records are not lost and remain accessible immediately after a disaster. Furthermore, virtual conferencing facilities can deliver advice where it is needed in an instant.
  • Virtual classrooms ensure learning continues in the direst circumstances. Additional stability can be delivered through the virtualisation of all core education infrastructure such as curriculum materials, student development data and achievement reporting.
  • Virtual security support systems (comms and HR) ensure clear leadership during a disaster and that security personnel's needs are effectively met - for example, they are paid and their families looked after.

The Accenture Technology Vision 2014 identified the trend Architecting resilience: “Built to survive failure”. The vision mainly focuses on the resilience of IT infrastructure, but it is clear that current IT developments enable the building of truly resilient government services.

The question HP+S organisations should be asking is: “Could your government continue to operate after a significant disaster in the physical world?”

Word Processing Accessible to Everyone

posted Oct 24, 2011, 2:57 AM by Jack Marrows   [ updated Oct 24, 2011, 2:57 AM ]

Google Docs is an online productivity suite offering a word processor, spreadsheet and presentation application (Google, 2010). Despite being a web application, its functionality and user interface places it in direct competition with traditional desktop applications such as Microsoft Word and iWork.

Docs boasts many advantages over its desktop counterparts. Documents:
  • are easily shared, instantly updated and can be accessed by multiple people simultaneously, 
  • offer added functionality through application programming interfaces such as Google Lookup and 
  • are backed up on many servers to ensure they are not lost. 
To encourage collaboration and ease of access, all documents are searchable and uniquely addressable through a URL. Furthermore, Google Docs ensures files can be accessed from any location with an internet connection, without the need to install an application (Strickland, n.d.).

Web browsers are responsible for rendering Docs’ user interface (UI), and a majority of the heavy processing is done server-side. Most popular web browsers are supported, and when cross-referenced with browser usage statistics we find that 88.3% of the world’s browsers can use Docs. HTML and Javascript are used to render the UI client-side. AJAX scripts make regular calls to the server, instantly updating the document when it is edited (Strickland, n.d.).

Docs is subject to limitations common amongst rich internet applications (RIAs). Browser, broadband speed and reliability limitations mean Docs can be slower to access data and respond to user input when compared to desktop applications. File storage is limited by the allowance offered by Google (currently 1 GB free). Furthermore, security concerns are often raised surrounding documents being stored online. Finally, Docs’ functionality is limited compared to the likes of MS Word (Strickland, n.d.).

ThinkFree is a word processor similar to Docs. However, before using the RIA, users download a Java applet, which allows the application to offer greater functionality. This may be an avenue Google needs to take in the future to compete with the functionality of desktop applications (Gottipati, 2007).

Google Docs and ThinkFree represent only a couple of the RIAs currently on the market, with many more available or in development. Compliance with HTML5 standards will require browsers to perform tasks traditionally performed by operating systems, and RIAs may replace a majority of desktop applications. This shift will see these services (applications) offer consistent interfaces and functionality to everyone, anywhere, at any time, without the need for high-powered processors or installation.

With the global movement towards RIAs, are there any applications that won’t work as an RIA?

References

 


ChatRoulette - Fail Fast, Scale Fast

posted Oct 24, 2011, 2:54 AM by Jack Marrows

       

There weren't many Internet start-ups in the years following the 2000 dot-com bust. However, ten years later the story is very different, and it has little to do with venture capitalists re-entering the market; rather, it is down to lightweight, scalable business models made possible by cheaper hardware, free open source code, free marketing and distribution, and powerful programming languages that reduce the need for large development teams (Graham, 2008).

Competitiveness in the global online market relies on developers delivering products before their competitors, and when they arrive they need to be cheap. Speed is of the essence; as Van Grove (2010) puts it, "the company with its name in lights is the company that most often will prevail". She was speaking about a recent start-up, ChatRoulette, a service where users have video conversations with random users around the world. ChatRoulette is a great example of a business which experienced growth at a viral level and scaled appropriately.

How a company receives revenue is directly related to how much the consumer pays for the product. In the case of ChatRoulette, the service is provided for free and revenue is generated through advertising. Costs are kept low because the bandwidth of the video is not run through the server (CamChat, 2010). ChatRoulette's founder is quoted as saying he turns a profit through such advertisement. Watson (2010) has suggested it is important for such companies to diversify possible revenue streams because the advertising market alone cannot be relied on.

Viral marketing is a cost effective method of promoting a new product, and it doesn't happen by accident. Web pages such as YouTube or news.com make it easy for users to share their content with their friends using buttons such as "email" or "share this article". Considering our example of ChatRoulette, Andrey Ternovskiy (founder) used internet forums to publicize his new service (Bidder, 2010). Word of mouth caused his service to go viral and move from 500 users per day to 1.5 million. To cater for the sudden growth in demand, Ternovskiy relied on outsourcing.

Businesses can access server storage and bandwidth as a service, meaning they only pay for what they use. Chatroulette used such a cloud service, which enabled it to scale quickly with demand when the service went viral. Ternovskiy also outsources development; he is currently employing four programmers from around the world who are working on improving the service. Outsourcing provides companies such as Chatroulette cost effective flexibility and allows them to focus on their core activities (Bucki, 2010).

By utilizing re-usable code, viral marketing, multiple revenue sources and outsourced hardware, services can experience fast growth and make quick returns on relatively small investments compared to traditional models (O'Reilly, 2005). Chatroulette took 17-year-old Ternovskiy two days and two nights to create and is now worth an estimated 10 million euros (Bidder, 2010). He focused on the core functionality, released quickly and did so without the need for venture capitalists. This new business model is known as lightweight and scalable, and it is profitable (Zawodny, 2004).

References


The Long Tail of Steam

posted Oct 24, 2011, 2:52 AM by Jack Marrows   [ updated Nov 4, 2011, 7:05 PM ]

 


Traditionally (before e-commerce), for a product to be successful it needed to have high-volume appeal. However, due to cost reduction and better targeted marketing (possible due to the Internet), money can be made selling to niche markets. Collectively niches can generate sales numbers similar to or better than traditional markets (Davis, 2005).

Steam is an online digital entertainment company that currently offers 1100 games for purchase, download and play from any computer (Steam, 2010). Compared to department stores, which rarely offer more than 100 titles (usually the more popular ones), Steam’s range is simply larger and caters to specific interests rather than current trends. Such business models are what Anderson referred to when he coined the term "the long tail" (Anderson, n.d.).


Traditionally retailers only targeted the ‘head of a market’. However, collectively, products in the tail have the potential for similar or greater sales (Anderson, n.d.).

Companies such as Steam are able to cater to these niche markets by using new technologies to reduce costs (Wikipedia, 2010). Users enter, edit and update their personal information, negating the need for some customer service employees. Steam users enter their own details, from payment information through to their billing address (Steam, 2010). Self-service extends to support: using the Steam forums and knowledge base, gamers can ask their peers technical questions. This reduces the amount of money spent on support staff.

Products purchased from Steam are delivered virtually over the Internet in a digital form (Steam, 2010). In contrast to traditional business models, Steam does not pay for products until they have been sold to a customer. Furthermore, like other companies that trade in digital content there are few costs associated with a physical retail front or delivery.

Where the cost of inventory storage and distribution is low, it becomes profitable to sell relatively unpopular products (Wikipedia, 2010). As a result, vendors no longer need to put consumers in a one-size-fits-all container (Anderson, n.d.). However, as the tail lengthens, consumers may find it hard to navigate to products of interest. Sites such as Steam use algorithmic data management to match supply and demand (O’Reilly, 2005). Based on the pages a user visits, Steam recommends other games they might be interested in, as sketched below.
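To illustrate the idea (this is not Steam's actual algorithm, and the sessions are made up), a recommender of this kind can be as simple as counting which games are browsed together and suggesting the strongest co-occurrences:

    # Illustrative co-occurrence recommender - not Steam's actual algorithm.
    from collections import Counter
    from itertools import combinations

    # Hypothetical browsing sessions: the games each user looked at.
    sessions = [
        {"Portal", "Half-Life 2", "Audiosurf"},
        {"Portal", "Half-Life 2"},
        {"Audiosurf", "Portal"},
    ]

    # Count how often each pair of games appears in the same session.
    pairs = Counter()
    for games in sessions:
        for a, b in combinations(sorted(games), 2):
            pairs[(a, b)] += 1

    def recommend(game):
        # Rank other games by how often they co-occur with this one.
        scores = Counter()
        for (a, b), n in pairs.items():
            if game == a:
                scores[b] += n
            elif game == b:
                scores[a] += n
        return [g for g, _ in scores.most_common()]

    print(recommend("Portal"))  # e.g. ['Half-Life 2', 'Audiosurf']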

In summary, as Anderson (n.d.) suggests, people gravitate towards niches because they satisfy narrow interests (and everyone has them). Trends show that, by using new technologies to reduce costs, niche markets collectively are as economically viable as their traditional high-appeal counterparts. Businesses can now be built on hitting the growing niche market (Holter, 2006).

As for the future of Steam, I would like to see them follow the trend of the Apple App Store and allow more independent developers to submit games for sale (this would lengthen their tail too).

References

Mindmeister: The Perpetual Beta Without the Beta Tag

posted Oct 24, 2011, 2:51 AM by Jack Marrows   [ updated Nov 4, 2011, 7:03 PM ]

“There's no final version. Nothing is static, everything is changing.” (Chitu, 2007). This is how Google describes the idea of perpetual beta: software as part of an indefinite and continuous development cycle (Wikipedia, 2010). This is the new way of thinking about software development, and one that gives young companies an advantage because they are not in the habit of the traditional software release cycle.

Mindmeister is a collaborative mind mapping tool that demonstrates the Web 2.0 principle of ‘Perpetual Beta’ and has seen success through these innovations. Meisterlabs has made operations a core competency. Their security page boasts 99.9% uptime and the highest data center standards (Mindmeister, 2010), and their product is updated many times a month. Such measures are essential for Web 2.0 applications that are offered as services because they won’t perform well without regular maintenance (O’Reilly, 2005).

Mindmeister user numbers

Meisterlabs subscribes to the benefits of releasing software early and often. Through the change log it is clear that their service is updated multiple times per month. The improvements made, however, are usually small and not always noticeable to users.

Improvements are able to be rolled out regularly through the use of lightweight, flexible and cross-platform programming languages. O’Reilly (2005) cites PHP, AJAX and Ruby on Rails as prime examples of code which can be written quickly to enable responsiveness. Mindmeister uses HTML, CSS and AJAX client-side to allow their application to be rendered by most browsers. Ruby on Rails is used server-side (Hollauf, 2010).


Mindmeister uses beta testers who post reports on bugs they identify, and Mindmeister aims to be responsive to these reports. O’Reilly (2005) identifies the use of shadow applications which can collect data on how an application is being used; these should be planned and implemented alongside the main application. Based on the data Mindmeister is able to provide, it can be assumed they use shadow applications. Other sites such as Google and Amazon have been known to engage a small percentage of their users in such tests before rolling out products. Testing is used to trial new business strategies and application functionality.


Web 2.0 means consumers are now purchasing services instead of artifacts. These services need to be maintained to respond to user requirements, and by leaving an application in perpetual beta, products see a faster time to market, reduced risk (less up-front cost), a closer relationship with customers and the ability to quickly and effectively respond to real-time data (O’Reilly, 2005).

Question for Thought

Will we see the Beta label return to products when large versions are released? 


References



Location, Location, Web 2.0 - Foursquare is Built for Many Devices

posted Oct 24, 2011, 2:48 AM by Jack Marrows   [ updated Nov 4, 2011, 7:01 PM ]

Foursquare is a location based social networking game. Users ‘check in’ at locations announcing they have been there in competition to become mayor. People contribute through submitting their thoughts, locations, pictures, information about their current location and friends (Wikipedia, 2010).

Utilizing features only available on cell phones has heavily contributed to Foursquare’s functionality and success. Photos are taken using cameras, locations recorded using cell location services and the application relies on a phone’s connectivity (Foursquare, 2010). Combined, these features have created an application that couldn’t exist solely on a desktop computer or phone.

Alongside desktop computers, Foursquare can be used on iPhones, Android phones, Blackberrys, Windows mobiles and systems running webOS (Foursquare, 2010). An application is currently being developed for Nokia handsets (Guim, 2010). The way each platform is used to its strengths, and data is shared between devices and services, makes Foursquare a great example of software above the level of a single device.

O’Reilly (2007) uses the example of iPhones being managed by iTunes to show how software can work in tandem to ensure a better user experience. Foursquare follows a similar model, allowing users to manage their profiles online. This means users aren’t required to type or read great amounts of information into/from their phones. The mobile applications are then left to perform the task they are great at: being mobile and, more importantly, immediately recording users’ thoughts or pictures in the context of location, harvesting rich forms of media.


Foursquare intends to generate revenue through advertising. Businesses can advertise directly, targeting people who are regularly nearby their businesses in the form of couponing (Carlson, 2009). Data generated by Foursquare has been used to create mash-up applications.

Fourwhere, a desktop web application, uses collected data to display comments made through Foursquare on a map. This is a great example of how a mash-up extends the usability and functionality of Foursquare's data using the context of location (Sysomos, 2010).


Notably, Foursquare encourages usage by allowing users to import friends from Facebook and Twitter. These services are also used to broadcast a user's location.

In summary, Foursquare utilizes the strengths of many devices, including a broad range of phones and desktop computers, to harvest rich media in the context of location. Thinking forward, we may soon see such applications incorporate the accelerometer and future phone features as they are released; this trend may lead to richer data being harvested without an increase in conscious human interaction.

Questions for thought

What other data types could be harvested without the need for human input?


References



Digg as a Platform

posted Oct 24, 2011, 2:45 AM by Jack Marrows   [ updated Nov 4, 2011, 7:04 PM ]

Digg is a web application which promotes popular web content based on user ratings. Recently the CEO of Digg announced that the company was now EBITDA profitable, which is a significant step in the organisation’s history (Sykes, 2010).

Perez (2010) noted that, to be competitive on the real-time web, Digg needs to increase traffic to its site. Currently, it may take a few days for a link to reach Digg’s homepage, where Twitter could spread the link in a matter of hours. Large-scale uptake of Digg’s API will encourage such traffic.

Digg's API

The Digg Application Programming Interface (API) has been created to let developers and partners interact with Digg's platform. Digg’s API allows developers to integrate its core functionality and data into their own application or a website. Functionality includes digging activity, rating and commenting on links (Digg, 2010).

Developers extend the functionality of Digg through creating mashups. These are often add-ons or third party applications that interact with other APIs to offer new functions. Digg supplies a wizard that offers less experienced web developers the script they require to incorporate Digg content and functionality into their own sites, further increasing traffic.


Developers can use the API to request very specific information about news stories, images and videos submitted to Digg (Digg, 2010). Applications request this information using REST and may choose from multiple response formats, including XML and Javascript.
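As a sketch of what such a REST call can look like from Python - the endpoint, parameters and response fields here are illustrative assumptions, not the documented API:

    # Hypothetical example of a REST request for popular stories.
    import requests

    resp = requests.get(
        "http://services.digg.com/stories/popular",  # assumed endpoint
        params={"count": 10, "type": "json"},        # ten stories, JSON format
    )
    for story in resp.json().get("stories", []):     # assumed response shape
        print(story.get("title"), story.get("diggs"))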

Developer Support

Digg supports developers by offering Digglite (an open source platform to be modified or built on), extensive documentation about the API, and an online community for discussion and support.

Terms of Service

Digg’s API is provided “as is” (Digg, 2010). Consequently, Digg isn’t legally required to support any failures; however, if they didn’t offer support, I suggest people would stop developing with the API. Furthermore, the TOS implies developers can pay for Digg API support.

Conclusion

Digg provides developers with a strong platform and, through innovation in assembly, has seen massive, scalable growth. The company makes money by offering advertising in a format that looks like Digg content (Gannes, 2010) and, with increased traffic, can continue to compete with social media sites and be profitable. Traffic can be secured through continued, stable management of the Digg API. Like all Web 2.0 applications, the more people use it, the better it gets.

Question for Thought

With so many mashups and third party applications being developed, will Digg still need a homepage in 10 years?

References

The Value of IMDB's Data

posted Oct 24, 2011, 2:41 AM by Jack Marrows

'Data is the next Intel Inside' refers to data being more valuable than proprietary software licenses and to the need for businesses to establish a data strategy. Many web applications offering low functionality achieve great success through appropriately harvesting and managing data. Today I am going to analyze the role data management has played in the success of IMDB.com (Internet Movie Database).

IMDB hosts data about movies, actors, fictional characters, television episodes and video games. The company operates as a subsidiary of Amazon.com after a deal was struck in 1998. This allowed Amazon to advertise on the site and ensured the data would remain available to the army of volunteers who contributed to the site’s content (Wikipedia, 2010).

Data is submitted by public contributors, verified by paid managers and published on the site (Wikipedia, 2010). This data is then enriched through user interactions such as ratings, reviews and comments. Interestingly, unlike Facebook's business model, IMDB does not take ownership of the data users upload (IMDB, 2010). This may help to foster trust and loyalty amongst its users.

Data hosted by IMDB has been accumulated since 1990 (IMDB.com, 2010) and is fundamental to the company’s revenue. The quantity of data hosted would be very expensive and time-consuming to recreate, which has secured the organization's position as a dominant player in the industry.

IMDB content can be used to power movie, television or celebrity projects and is marketed as authoritative and accurate information (IMDB, 2010). This data is sold as a subscription at $15,000 per year as of March 1, 2010. Clatworthy (2010) suggests this will lead to no free application programming interfaces (APIs) being publicly available. The lack of APIs may have a walled-garden effect, meaning data is not extended beyond the limitations of IMDB’s functionality and data harvesting capabilities. In this linked article, O’Reilly has outlined some applications an IMDB API could enable (O’Reilly, 2006).

In summary, IMDB illustrates how valuable data is. However, O’Reilly (2005) states
“we expect the rise of proprietary databases to result in a Free Data movement”
and this is already on the horizon, with omdb.org (Open Movie Database) gaining momentum. If IMDB wants to stay competitive, they may need to freely share their databases; otherwise an open data solution may reel in their dominance.


References



