
But today, years after that data was absorbed from a team that may no longer exist, what procedure could meaningfully evaluate that old data for accuracy? And the longer such an evaluation is delayed, the more errors will permeate the environment.
An IT working group could use a variety of guidelines to weed out such data, not by determining the accuracy of the old data, but by identifying large chunks of data that can simply be wiped. An example might be: “Any prospect list that is more than 10 years old should be automatically wiped, given the strong chance that little to none of that data would be viable.”
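An age-based purge rule like the one above lends itself to simple automation. The sketch below is a minimal, hypothetical illustration: the record shape (`created_on` field), the `RETENTION_YEARS` threshold, and the function names are all assumptions, not part of any real policy engine; in practice the rule would run against a database, with the wipe step logged and reviewed rather than executed blindly.

```python
from datetime import datetime, timedelta

# Hypothetical policy threshold: prospect lists older than this are wiped.
RETENTION_YEARS = 10

def is_stale(created_on, now=None):
    """Return True if a record falls outside the retention window.

    Uses 365-day years for simplicity; a real policy might compare
    calendar dates instead.
    """
    now = now or datetime.now()
    return now - created_on > timedelta(days=365 * RETENTION_YEARS)

def partition_for_wipe(records):
    """Split records into (keep, wipe) lists based on the age rule."""
    keep, wipe = [], []
    for rec in records:
        (wipe if is_stale(rec["created_on"]) else keep).append(rec)
    return keep, wipe
```

The key design point is that the rule avoids judging accuracy record by record: it identifies whole blocks of data old enough that, per the guideline, little of it is likely to be viable.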
David Neuman, the COO at consulting firm Acceligence, pointed out that enterprises should also identify databases that should be retained for as long as possible, “such as scientific data, especially meteorological data.”
