Proof of testing



  • I’d be interested in feedback from anyone who has been audited on the following:
    If you have a process in place whereby power users test a change and provide their authorisation for that change to go live, how important is it that proof of testing be retained in the change management system?
    I believe in our case the testing is not what you would call highly documented; the company relies on the experience and skills of the power users to perform an adequate level of testing. This situation has gone on for years without major problems.
    What scope do you have for arguing that, as a proven process with a history of success, management believes it is appropriate for the level of risk it carries?
    Regards



  • In our change control process, end user testing must occur and the users must sign off that they are satisfied with the testing results. We also require documentation of the change, whether it be a copy of a report, a screen shot of a before-and-after schedule, scripts, etc. That documentation is then retained with the hard copy of the change control.



  • Our procedures have testing done first within IT, and then we require approval from the user before changes can be moved into production. We are probably about 50/50 on retaining test results produced by IT; it really depends on the complexity of the change. If we are doing a major release or a patch-type upgrade, we definitely keep the test results. If the change is a screen change, screen shots are all we keep. If we provide the user with a report for them to approve, we keep the report. But regardless, we still require user approval. We were tested on this last year and it was noted that we did not have some test results. We still passed.



  • I am new to Sarbanes-Oxley, so if my comments are not relevant, I do apologize.
    In my opinion, the best testing methodology is one in which test conditions and cases are determined in parallel with, or prior to, the specifications for a modification. In other words, when (specifications for) a change is developed, the effect on the functions of the system is also analyzed and a determination is made on how to test the new functions. The tests could be merged with other test cases that already exist for the system.
    I have read a few discussions in this forum on the subject of testing and none of the prior threads seem to describe testing methodologies in this manner. Is it because I misunderstand what is meant by testing?
    I am not an auditor, accountant, or manager. I am a Programmer/Analyst with many years' experience in maintenance programming. I am attempting to learn about Sarbanes-Oxley as fast as I can, but I have just begun. I have chosen to start by going through the messages in this forum. If you have a suggestion for what would be most relevant for me to read, please say so.
    I apologize for hijacking this thread; I hope my message is appropriate.



  • Welcome, Sam, to our corner of the (SOx) world.
    Don’t apologize, please…your comments are interesting.
    I’m not an IT auditor, but what you say seems to make sense on an intuitive level, although I think that the General IT/COBiT part of SOx has been pretty fluid (and, there have been fewer ‘best practices’ than on the accounting cycles/processes).
    I’m looking forward to reading some of the comments from the IT folks among us.
    John



  • In my opinion, the best testing methodology is one in which test conditions and cases are determined in parallel with, or prior to, the specifications for a modification. In other words, when (specifications for) a change is developed, the effect on the functions of the system is also analyzed and a determination is made on how to test the new functions.
    Welcome and thanks for sharing 🙂 I agree with these recommendations and will also share some quick thoughts on testing:

    1. Expected test results should be determined in advance. The predetermined results can be matched against a production report or against results computed manually by the users.
    2. All expected and unexpected differences must be carefully researched and documented. Ideally, the test sample is representative of the data conditions found in production.
    3. Results should be stored in project libraries (usually special folders on network shares) so that history can be retained.
    4. Security is important for any test results (esp. SOX compliance tests) that must be part of the official record. These must go through a change management process, where only an authorized approver can publish results to a restricted network share.
    5. Some companies may also see a need to retain the actual paper copies as well.
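    Points 1–4 above can be sketched in code. The following is a minimal illustration, not anything prescribed by SOX and with all names hypothetical, of comparing predetermined expected results against an actual test run and producing a timestamped, tamper-evident evidence record that could then be published to a restricted project library:

```python
import hashlib
import json
from datetime import datetime, timezone

def compare_results(expected, actual):
    """Compare predetermined expected results against an actual test run.
    Returns a list of (key, expected, actual) tuples for every mismatch."""
    mismatches = []
    for key, exp_value in expected.items():
        act_value = actual.get(key)
        if act_value != exp_value:
            mismatches.append((key, exp_value, act_value))
    # Records present in the run but never predicted are also differences.
    for key in actual:
        if key not in expected:
            mismatches.append((key, None, actual[key]))
    return mismatches

def evidence_record(change_id, tester, mismatches):
    """Build a timestamped evidence record suitable for a project library."""
    body = {
        "change_id": change_id,
        "tester": tester,
        "executed_at": datetime.now(timezone.utc).isoformat(),
        "result": "PASS" if not mismatches else "FAIL",
        "differences": [list(m) for m in mismatches],
    }
    # A digest over the record makes later tampering detectable.
    payload = json.dumps(body, sort_keys=True)
    body["sha256"] = hashlib.sha256(payload.encode()).hexdigest()
    return body
```

    In practice the record would be written to a share that only an authorized approver can publish to, per point 4.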


  • Thank you for your replies. I at least know that it will be worthwhile for me to create (new) threads on the subject.
    I did not state explicitly whether what I was describing is a SOX requirement; I think it is not a requirement, but I think it is the better way to test and should (could) be a valid alternative. As best I understand the SDLC, it is basic to all good design techniques.



  • A good whitepaper on Best Practices and SOX testing:
    infotech.aicpa.org/NR/rdonlyres/F8823A4C-9767-4DD3-9CD9-D9E83A934283/7888/GTAG3Brochure1.pdf
    The document provides strong justification for implementing a real-time approach to executing the tests of controls performed in connection with SOX compliance efforts.
    As in human development, one must learn to crawl before walking and to walk before running; similarly, an organization will perform tests of controls periodically before evolving to a stage where it can develop and successfully embrace a real-time controls monitoring approach.
    Milan



  • Harry, while I agree with your theory on predetermined test results, let me ask a question. What about scenarios where your test environment is dynamic and not a mirror of production? Any number of developers could be testing and updating the numbers as a result. If someone builds expected test results today, they could be invalid tomorrow. Certainly someone needs to prove that the results are accurate at the time of testing, but depending on the development timeline (days vs. months), building expected test results could be an exercise in futility. Is it acceptable in those scenarios to have generic test results such as ‘the numbers in column X should total the number from column Z on report Y’?
    A more general question: What is the overarching value in documenting test results if someone is agreeing that he has tested (and system logs confirm the execution of the related programs)? Are there penalties for not documenting results? Do those penalties outweigh the time and cost of storing a document for every change made to a set of systems? Just curious.



  • What about scenarios where your test environment is dynamic and not a mirror of production? Any number of developers could be testing and updating the numbers as a result. If someone builds expected test results today, they could be invalid tomorrow.
    Hi and welcome to the forums 🙂 I agree and this is also valuable to take into consideration from the viewpoint of expected v. unexpected test results. Some comments:

    1. Predetermined results in a dynamic environment are best derived manually. In these cases, a user might calculate the end result on ‘paper’ (e.g., in an Excel spreadsheet) and then cross-check it against the test run.
    2. Sometimes exact results cannot be predetermined. This can occur because of sort orders (e.g., records with the same key may not be sequenced the same way from one run to the next, so your tests could have slightly different results). Usually, these types of differences are minor and explainable.
    3. If the goal is to match production results, then you can use production as your test baseline for comparative analysis. This is often the case.
    4. SS and I were sharing ideas more in the context of traditional testing rather than SOX requirements. For SOX 404 related activities, ‘sampling and testing’ is more about:
      a) Are work flow controls being followed?
      b) Do the sampled records being tested match production from a financial and accountability standpoint?
      c) Are there any anomalies or unexpected differences? (e.g., did someone accidentally or even intentionally alter the financials?)
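      To illustrate points 2 and 3 above, here is a hedged sketch (hypothetical names, not a SOX requirement) of comparing a test run against a production baseline while tolerating benign sort-order differences, by treating both result sets as multisets of rows:

```python
from collections import Counter

def order_insensitive_diff(baseline_rows, test_rows):
    """Compare a test run against a production baseline while ignoring
    row ordering, which can legitimately differ between runs when
    records share the same sort key. Rows must be hashable (e.g., tuples)."""
    base = Counter(baseline_rows)
    test = Counter(test_rows)
    missing = list((base - test).elements())  # in baseline, absent from test
    extra = list((test - base).elements())    # in test, absent from baseline
    return missing, extra
```

      Any rows reported as missing or extra are the "unexpected differences" that would then need to be researched and documented.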
      A more general question: What is the overarching value in documenting test results if someone is agreeing that he has tested (and system logs confirm the execution of the related programs)? Are there penalties for not documenting results? Do those penalties outweigh the time and cost of storing a document for every change made to a set of systems? Just curious.
      Milan shared an excellent document on best practices related to testing controls for SOX:
      infotech.aicpa.org/NR/rdonlyres/F8823A4C-9767-4DD3-9CD9-D9E83A934283/7888/GTAG3Brochure1.pdf
      For your day-to-day system testing, you can use your current approaches, although, as this 43-page document reflects, an organization can improve quality by staying in a ‘continuous improvement’ mode for its testing methodologies. Having worked in IT for over 30 years, I can say there is definitely no one-size-fits-all approach when it comes to proving out the quality and reliability of a new system or a system change.
      As noted in point #4 above, there are some special considerations for SOX testing of sampled financial records and control processes that have to be handled in a special manner.
      Thanks for sharing these good questions.


  • Hello everyone,
    I was trying to look up information while researching SOX-based IT audits, and this forum has helped me enormously. BTW, I stumbled across this forum from the ISACA-DC website.
    My background is IT auditing for pharma companies, primarily from an FDA perspective. Because of the FDA regulations and somewhat better guidelines (they have been around for a while now), most pharma companies already have pretty good IT controls and processes in place, so it was fairly easy for us to be SOX compliant as well. True, some of the systems with a material impact on the company’s financials did not need to be compliant from an FDA perspective, but because the corporate IT processes and controls (incl. policies and procedures) were already in place, implementing them was not a major effort. The underlying principles of controls and implementation of best practices are very similar. I have recently moved from the pharma world to financials as an IT auditor, primarily for SOX initiatives.
    Now, onto the subject at hand: proof of testing. BTW, this is based solely on my observations at 4 different major pharma companies. In the pharma world, testing is a very controlled exercise from a documentation and environment perspective. I would say almost 80% of testing (and I am talking about ‘testing’ from an SDLC standpoint) is well documented (approx. 20% is probably ad hoc, where testers attempt to tear apart the system). Besides standard industry practices like having test scripts with predefined expected results and pass/fail criteria, pharma takes it a few steps further. Typically, test procedures need to be approved (by an SME, developer, peer, or manager) prior to formal execution in a controlled environment. Fixes/patches installed on account of bugs reported from the formal testing process need to be code reviewed and have proof of unit testing before being deployed in the test environment.
    Developers may or may not be allowed access to the test environment, and never to UAT or PROD. Test execution requires documented proof in the form of VOEs (Verifiable Objective Evidence), be that logs, screen captures, or reports. One place even required a witness. After execution, the script along with the VOEs is reviewed and approved by a manager/lead. The hard copies of the scripts and VOEs are signed, initialled, dated, and eventually archived; during internal/FDA audits they are retrieved and reviewed in detail. Electronic copies are stored in a controlled electronic document management system (e.g., Documentum, Qumas). Now, all this is fine and dandy, but it does slow down the whole implementation cycle, not to mention the additional jobs (;-)), and most pharma companies do not compromise on these for fear of the FDA.
    While the pharma world is on one side of the spectrum, what would/should be sufficient to prove testing was done in the non-pharma world? I think my judgment tends to be prejudiced, having been in pharma for so long, so I am trying to find the middle ground as an IT auditor venturing outside of pharma. Thank you, and I hope my blabbering about my background gives more perspective to my post.
    Thanks



  • I work for an IT consulting firm.
    We have a large financial client who asked us to assist them with their SOX testing policy and procedure.
    They want us to provide them with someone who will be responsible for ensuring their testing policy and procedure are in line with SOX.
    They have two legacy systems, and their technical teams will provide our SOX consultant with evidence that they are compliant from a policy and procedures standpoint.
    We need to provide a person who can take the client’s evidence, match it up with the actual policies and procedures handed down from the SOX teams, and make sure they are in compliance.
    My question is: is there such a thing as a technical SOX testing person, or is this role just more of an auditor?



  • My question is: is there such a thing as a technical SOX testing person, or is this role just more of an auditor?
    Hi and welcome 🙂 … The job role you’ve shared could be performed by Information Systems, Internal Audit (IT auditor), or even a highly technical business analyst.
    The use of a person from Internal Audit seems the most logical choice to provide a measurement of SOX testing compliance. In some respects, they are already used to performing unbiased assessments of policies, procedures, and controls; or in theory, at least, we hope so 😉 😄
    Someone from the IT team could also perform this function, esp. if they have had experience in writing security policies, procedures, and standards. They would offer skills from the technical side in looking at timestamps and technically related evidence.
    Probably, the business analyst would have the most difficulty transitioning, but if their prior experience and technical skills made them a good fit, you wouldn’t want to rule this out either.

