Discussion:
[SCRUMDEVELOPMENT] Agile SOX
Michael Wollin yahoo@mercurysw.com [SCRUMDEVELOPMENT]
2015-10-21 19:21:01 UTC
Hi all,

Where I am coaching, there is a powerful and persistent worry about SOX compliance. I am leading a couple of training classes next week, and because of this concern I will be covering the topic. How best to do that? Figure half an hour at most (including Q&A).

What I have surmised is that part of SOX can be covered in the DoD, and part can be covered in the User Stories as Acceptance Criteria. I also posit that the PO accepting the story might be sufficient for the user acceptance tests, since the PO comes from the business.

Can someone point me to a few representative samples of these?
What else is there to know? Is there anything outside of stories and DoD that is part of how agile handles SOX?

My goal is to get them to a) worry less because it’s handled, b) not hamstring themselves with some overkill process, and c) make sure they can survive, or better, avoid an audit. (Indeed, we have a project named Phoenix here. :)

Michael
Wouter Lagerweij wouter@lagerweij.com [SCRUMDEVELOPMENT]
2015-10-21 20:14:58 UTC
Hi Michael,

I'm sure there are people here with more experience in this than I have, but I did
work on a project at a large European insurance company where there was a
lot of focus on compliance and traceability. We had a number of sessions
with auditors about the process, and mostly they were very happy with what
happened in our scrum-ish-kanban-ish way of working.

There were a couple of things we did that were mentioned as being
important to our passing those audits:

- The Definition of Done, as you said, was important to them. For us to
comply, we actually had the DoD as a real checklist, checked off and signed
by the Product Owner and, IIRC, at some point by one of the auditors (we
figured that if they wanted to audit the process, they'd better attend the
demo... :-) ). These signed DoDs were kept and archived as project
documentation.
- The test 'documentation': we used a BDD approach, with the outcomes of
conversations about requirements laid down in gherkin-style test scenarios
(there is a rough sketch of the kind of scenario I mean after this list).
This meant that for each 'requirement' (user story) we had a linked set of
tests, and since we automated those tests, we could show beyond doubt that
the requirement was actually implemented and tested. This was a huge hit
with the auditors, who came in expecting to slog through large sets of
'test plan' spreadsheets, but ended up clicking through our generated
test reports.
- A weekly (sprint-ly) report with the results of the sprint (mostly just
which stories were done) was also needed as project documentation, signed
by a scrum master and the project/program manager in charge. We didn't do
anything with these documents, but again it was something that made the
auditors and the surrounding company happy.
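
As a rough illustration of that second point: the feature, the story text
and the amounts below are invented for this post, not taken from the actual
project, but a gherkin-style scenario tied to a user story might look
something like this:

    # Linked to user story: "As an accountant, I need journal entries
    # over EUR 10,000 to require a second approval."
    # (Story wording, feature name and amounts are made up for illustration.)
    Feature: Journal entry approval

      Scenario: Large journal entry requires a second approver
        Given a journal entry of EUR 15,000 posted by a clerk
        When the clerk submits the entry for processing
        Then the entry is held in "pending approval" status
        And it is only booked after a manager has approved it

Because every scenario carries its story reference and runs automatically,
the generated test report effectively doubles as the traceability evidence
the auditors were looking for.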

This was, as you might deduce from these points, not the most agile
environment. But we worked with a group of about 60 people (6 teams plus
'overhead' like me), in weekly sprints, test driven, with a
pretty-continuous delivery set-up, for about a year, and with these kinds of
measures we demonstrated more rigorous compliance than our formal caretakers
had been used to. They were a little reluctant to accept an iterative
approach to the auditing at first, but that settled after a while.

hth,

Wouter
--
Wouter Lagerweij | ***@lagerweij.com
http://www.lagerweij.com | @wouterla <http://twitter.com/#!/wouterla>
linkedin.com/in/lagerweij | +31(0)6 28553506
'Rohan D'Sa' rohan.dsa@gmail.com [SCRUMDEVELOPMENT]
2015-10-22 05:52:49 UTC
Great answer, Wouter!


Michael, I'd asked this question on the Scrum Alliance group a while back.
Those answers might help you as well:
https://groups.google.com/forum/m/#!searchin/scrumalliance/Sarbanes/scrumalliance/nHkaWNE1HQ0


Groeten,
Rohan
'Rohan D'Sa' rohan.dsa@gmail.com [SCRUMDEVELOPMENT]
2015-10-22 09:09:26 UTC
Sorry, use this link instead:
https://groups.google.com/d/msg/scrumalliance/nHkaWNE1HQ0/Yj1pj_weAQAJ