By Ray Scott | Computerworld UK
According to Tim O'Reilly, the open source advocate: "We're entering a world in which data may be more important than software." More accurately, I would suggest, we are awakening to a world in which data has always been more important than software. A perfect example of this point is the rise of test automation tools as the 'next big thing' during the 1990s, with market leaders such as Mercury QuickTest Professional and Rational Robot (now owned by HP and IBM respectively) becoming industry standards.
As more and more vendors entered the market with new technologies, it became possible to achieve the benefits of test automation – increased efficiency, reduced costs, greater resource flexibility and so on – across the entire enterprise. Yet few organisations have been able to realise these advantages, despite making the significant initial investment required. Instead, most software development projects are still forced to compromise on quality in order to meet tight deadlines and budgets. This is why it's time to awaken to the importance of data in maximising the value of our testing effort.
More often than not, testing focuses on the "Happy Path" at the expense of negative and non-functional testing. To test your application rigorously, you need to be able to provision high-quality, 'fit for purpose' data, regardless of which methodologies or tools you use. In short, data matters! Despite this, data continues to be overlooked when project requirements are designed. This has major implications for your ability to deliver quality software to market: on time, on budget and within scope. If you don't understand the data, you can't understand the business model you are testing.
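The distinction above can be sketched in code. The example below is a minimal, hypothetical illustration (the `validate_order` function and its rules are invented for this sketch, not taken from any real system): a single happy-path check passes easily, but the application's behaviour is only really exercised once negative and boundary data is provisioned alongside it.

```python
def validate_order(quantity, price):
    """Hypothetical business rule: quantity must be a positive
    integer and price must be non-negative."""
    if not isinstance(quantity, int) or quantity <= 0:
        return False
    if price < 0:
        return False
    return True


# Happy-path data: one typical valid input -- the case most test
# suites cover thoroughly.
assert validate_order(quantity=3, price=9.99) is True

# Negative and boundary data: the cases that are often skipped,
# and exactly where 'fit for purpose' test data earns its keep.
assert validate_order(quantity=0, price=9.99) is False    # zero quantity
assert validate_order(quantity=-1, price=9.99) is False   # negative quantity
assert validate_order(quantity=3, price=-0.01) is False   # negative price
```

The point is not the trivial function itself but the data sets: the happy-path assertion alone would report a green build while three realistic failure modes went untested.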