Testing Change Management



  • We are testing changes made to production environments of two types: data and code.
    For code, we trace from the request ticket to the objects in production, and from the objects in production back to the request ticket.
    For data, this appears impossible, as timestamps and archive history are not maintained for data since it changes so often.
    For population validation purposes, how would one test that we have identified all data changes during the testing period?
    Thanks.



  • Can you explain in more detail how the changes are made, and whether a log of all changes can be obtained from Systems Reports? Your IS department may be able to trace this back through exception reports.



  • For data, it appears this is not possible, as timestamps and archive history are not maintained for data since it changes very often. How would one test, for population validation purposes, that we have identified all data changes during the testing period?
    Hi - I agree that capturing specific changes of information is very difficult.
    One approach might be to periodically create backups of the data, establishing ‘baselines’ against which to compare the most current file. If the data is structured in traditional flat files, this is less difficult than if it sits in a series of relational tables. If you are using databases, you may have to capture just the key fields and summary-level information into a special table to compare against.
    There are also software packages that may be able to audit changes in data. For future systems, capturing a timestamp when rows/records are inserted or updated would provide a good audit trail for when an entry changed, but it still won’t track each individual field within an entry.
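    The baseline approach above can be sketched in a few lines. This is only an illustrative example, assuming each record has a single key field and the snapshots are flat files; the field names and file layout are hypothetical, not from any specific system. Each snapshot is reduced to a map of record key to a hash of the remaining fields, so two snapshots can be compared to flag added, removed, and changed records.

    ```python
    import csv
    import hashlib

    def load_baseline(path, key_field):
        """Reduce one snapshot file to {record key: hash of all other fields}."""
        baseline = {}
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                key = row[key_field]
                # Sort fields so the hash is stable regardless of column order.
                payload = "|".join(v for k, v in sorted(row.items()) if k != key_field)
                baseline[key] = hashlib.sha256(payload.encode()).hexdigest()
        return baseline

    def diff_baselines(old, new):
        """Classify record keys as added, removed, or changed between snapshots."""
        added = sorted(set(new) - set(old))
        removed = sorted(set(old) - set(new))
        changed = sorted(k for k in set(old) & set(new) if old[k] != new[k])
        return added, removed, changed
    ```

    Note the limitation mentioned above still applies: the hash tells you *that* a record changed between snapshots, not *which* field changed or when.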



  • What kind of data are we looking at here? Can you elaborate? Data keeps changing all the time through manual or automated entries. Does it really need to go through change management?
    Calvin



  • Calvin makes a great point on examining the fields that need to be tracked. While not all data elements can be evaluated from a change management perspective, the most important focal point is evaluating changes to financial information.
    One technique we recently used was baselining (as described earlier): taking summary snapshots of the financial master files at points in time (monthly closing balances). You can then compare snapshots to track changes in monetary values, ensuring the changes represent legitimate, complete transactions.
    This may be easier said than done for complex files. For example, snapshots of input files can be blended into the reconciliation process, so that all inputs must be fully accounted for. Here you would see how accepted, rejected, and suspended items were accounted for on the financial master file. All out-of-balance conditions would need to be fully explained.
    What I’ve just shared is not necessarily SOX-mandated, but it certainly helps provide assurance that the financial systems are working accurately. The intra-system reconciliation process was important enough that we built it regardless of SOX guidelines per se. It provides some great tools for the financial area and for audit, but it will take time and development resources to accomplish.
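    The two reconciliation checks described above (every input lands in exactly one bucket; only accepted items move the master-file balance) can be expressed as a small sketch. The function name and the idea of passing period totals as plain numbers are hypothetical simplifications, not a real system's interface.

    ```python
    def reconcile(opening_balance, closing_balance,
                  accepted, rejected, suspended, total_input):
        """Check one period's input snapshot against movement on the master file.

        All figures are monetary totals for the period. Returns a list of
        out-of-balance conditions; an empty list means the period ties out.
        """
        issues = []
        # Every input item must be accounted for as accepted, rejected, or suspended.
        if accepted + rejected + suspended != total_input:
            issues.append("inputs not fully accounted for")
        # Only accepted items should move the master-file balance.
        if opening_balance + accepted != closing_balance:
            issues.append("closing balance does not tie to accepted inputs")
        return issues
    ```

    In practice each flagged condition would be investigated and explained, as the post above notes.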



  • What kind of data are we looking at here? Can you elaborate? Data keeps changing all the time through manual or automated entries. Does it really need to go through change management?
    Calvin
    I agree that data changes all the time. The external auditors (EA) are requesting that we validate the population of ‘data’ changes that go through the change management process. These would be changes to data that users would normally make via their screens but, due to the volume of transactions, opted to have the DBAs run scripts to change (e.g. codes for vendors, dates, transaction IDs, etc.).
    They wanted to do a baseline, as stated above. I feel that is overboard for SOX purposes. Data is dynamic; it is not easily traced back, and software to do this is not an option at this company.
    With the reconciliation controls on the business side, the risk is minimal if we don’t do the trace-back or baseline for data changes.
    Would I be missing the boat if I stated that to the EAs?



  • Have you clarified which changes your auditors are referring to?
    Is there a chance that they are only referring to changes to master data?



  • The external auditors (EA) are requesting that we validate the population of ‘data’ changes that go through the change management process. These would be changes to data that users would normally make via their screens but, due to the volume of transactions, opted to have the DBAs run scripts to change (e.g. codes for vendors, dates, transaction IDs, etc.).
    Hi - I like EMM’s idea of gaining clarification from the EAs 🙂 Applying the term ‘change management’ to data is a little confusing – at least to me.
    To me, change management is a process involving approvals, autonomy levels, and other controls, rather than users simply entering information as part of their routine daily job roles. I could see change management concepts applying where field formats are changing and you have to promote new schemas, forms, screens, reports, and other items into production.
    Certainly, it’s important to track financial changes, and as you shared earlier there is a good balancing and reconciliation process in place.
    I can only see the auditors’ point for a small selection of records, not the entire master file. Most auditors I’ve worked with like to conduct sampling. They need to follow changes from the front-end systems all the way through the financial systems for the selected samples. That process is reasonable, but reviewing the entire master file is not, as you can’t humanly look at every change of information in a large repository.



  • Harry,
    I agree with you on the sampling; that is not the issue. They want to validate the population of data changes (mass user financial changes, schemas, table indexes, etc.).
    I argued that the source would be our change management ticketing system. Their request was to validate the ‘data’ changes in a fashion similar to how one would validate code changes (e.g. look at the last-modified date for a particular object/DLL in production and tie it back to a change request ticket).
    My argument was that you could not do that with data changes (mass user financial changes, schemas, table indexes, etc.). They have since agreed that going from the database and tracing back to a change ticket is not feasible, and will only be testing a sample of change tickets from the population of change tickets for xx to xx dates.
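    Drawing that sample from the ticket population can be as simple as the sketch below. The ticket-ID format and the idea of a fixed seed (so the selection can be re-performed and reviewed) are illustrative assumptions, not the EAs' actual method.

    ```python
    import random

    def sample_tickets(tickets, sample_size, seed=None):
        """Draw a simple random sample of change tickets for testing.

        `tickets` is the full population pulled from the ticketing system
        for the test period. A fixed `seed` makes the draw reproducible,
        which helps when the selection itself may be reviewed.
        """
        rng = random.Random(seed)
        if sample_size >= len(tickets):
            return list(tickets)
        return rng.sample(tickets, sample_size)
    ```

    The important control point, as discussed above, is that the population comes from the change management ticketing system itself, so completeness of the population is established before the sample is drawn.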



  • My argument was that you could not do that with data changes (mass user financial changes, schemas, table indexes, etc.). They have since agreed that going from the database and tracing back to a change ticket is not feasible, and will only be testing a sample of change tickets from the population of change tickets for xx to xx dates.
    This is a more realistic approach and I’m glad it worked out for you all 🙂

