All technology applications that use data share one driving factor that dictates much of their success: quality data. This is true regardless of application type. A B2C consumer application like foursquare®, a technical support knowledge base and a corporate ERP system all need quality data to work well. When it comes to data, quality can mean accuracy, completeness, timeliness and many other attributes that determine whether or not the data is useful within the application.
What I wrote above may seem a fairly basic concept when it comes to software applications, and you might wonder why I am even writing about it. If the concept is so basic, then why do so many organizations struggle or stumble when building systems that require quality data to function? To answer this, let's examine two paths that most organizations have to go down when managing data: one is tactical and one is strategic in nature. In other words, there are short-term impacts to data quality and, more significant in my opinion, long-term impacts.
Tactical (a.k.a. short-term): Data migrations, the merging of disparate systems, data loading related to new business lines and other batch-oriented "point in time" data requirements can be defined as more tactical in nature. For a specific reason, a set of data needs to be loaded, merged or adjusted to meet a defined need. This could occur as part of a system implementation or simply be driven by ongoing business changes. Many organizations get this right. They understand that the data needs to be cleansed and adjusted as part of the process. Resources are allocated, priorities are established and a relatively good job is done of processing the data so that it integrates into the system appropriately. The batch data flows into the specific application and starts out with a high level of quality.
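To make the cleansing step concrete, here is a minimal sketch of a pre-load cleansing pass for a batch migration. The field names ("name", "email") and rules are hypothetical examples, not a prescribed schema; real migrations would route rejects to manual review rather than silently dropping them.

```python
# Minimal sketch of a pre-load cleansing pass for a batch data migration.
# Field names ("name", "email") and the rules below are hypothetical examples.

def cleanse_records(records):
    """Normalize and de-duplicate records before loading them into the target system."""
    seen = set()
    cleaned = []
    for rec in records:
        name = rec.get("name", "").strip()
        email = rec.get("email", "").strip().lower()
        if not name or not email:
            continue  # skip incomplete rows (in practice, flag for manual review)
        if email in seen:
            continue  # drop duplicates keyed on normalized email
        seen.add(email)
        cleaned.append({"name": name, "email": email})
    return cleaned

batch = [
    {"name": " Contoso Ltd ", "email": "INFO@CONTOSO.COM"},
    {"name": "Contoso Ltd", "email": "info@contoso.com"},   # duplicate
    {"name": "", "email": "missing-name@example.com"},      # incomplete
]
print(cleanse_records(batch))  # one clean record survives
```

The point is not the specific rules but that normalization and de-duplication happen before the data ever enters the application, so it "starts out with a high level of quality."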
Strategic (a.k.a. long-term): Let's be honest: in most systems, data quality is a long-term process. The design, implementation and operation of a system require a focus on data quality over the life of the system. So what influences better data quality over the long term? It is impossible to list everything, but if you are supporting a Microsoft Dynamics CRM implementation, the following elements should be part of your strategy:
- Make your CRM system an enterprise master data repository. Integrate it with your other key systems to drive data quality across the organization.
- Assign an organizational application owner and appropriate resources to maintain a focus on data quality.
- Engage an experienced vendor that brings "been there, done that" experience to your data decision-making process.
- Remember that less is more when it comes to data. Resist the desire to capture any data that does not have a defined business process behind it.
- Use software tools that will simplify tasks related to data maintenance and cleansing.
- Educate your users. Lack of understanding related to the “why” and “how” of systems contributes to poor long-term data quality.
- Effective use of reports will help give visibility to poor quality data. This can drive better data quality long-term.
- Always implement the Outlook Client for CRM to gain the data quality benefits of integration and synchronization.
- Define and adhere to a strict process for creating master company and contact records. Develop audit reports so that data is improved long-term.
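The audit-report idea in the last bullet can be sketched simply: periodically scan contact records for likely duplicates so someone can review and merge them. The record fields below are hypothetical illustrations, not the actual Dynamics CRM schema.

```python
# Sketch of a duplicate-audit report over contact records.
# Field names ("id", "last_name", "email") are hypothetical, not the CRM schema.
from collections import defaultdict

def duplicate_audit(contacts):
    """Group contacts by a normalized key and report groups with more than one record."""
    groups = defaultdict(list)
    for c in contacts:
        key = (c["last_name"].strip().lower(), c["email"].strip().lower())
        groups[key].append(c["id"])
    # Only keys that map to multiple record ids are potential duplicates.
    return {key: ids for key, ids in groups.items() if len(ids) > 1}

contacts = [
    {"id": 1, "last_name": "Smith", "email": "j.smith@example.com"},
    {"id": 2, "last_name": "smith ", "email": "J.Smith@example.com"},
    {"id": 3, "last_name": "Jones", "email": "a.jones@example.com"},
]
print(duplicate_audit(contacts))  # records 1 and 2 flagged for review
```

Running a report like this on a schedule gives the visibility into poor-quality data that drives improvement over the long term.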
Following these guidelines can put your organization on a path towards long-term data quality.
CRM 2011: Upgrading to CRM 2011 has numerous benefits that drive long-term data quality, too many to list here. If your organization has not already completed this upgrade, it should have a plan to do so.