Hi,
After merging the infrastructure to check data validity in an existing
database before applying an upgrade script, I'm reviewing the existing
scripts to see which ones qualify to be enhanced with such a check and
user intervention.
One script that qualifies is 1.5/invoice-tbl-cogs-constraint.sql, which
adds a constraint verifying that the COGS routines post sensible data.
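To make the discussion below concrete, here's roughly the shape I have
in mind for that constraint. The exact expression lives in the script
itself; the check below is my paraphrase (using the qty and allocated
columns of the invoice table) and the constraint name is made up:

  -- Paraphrased shape of the new check, not the actual script:
  -- an invoice line's COGS allocation must have the opposite sign of
  -- its quantity and may never exceed that quantity in magnitude.
  ALTER TABLE invoice
    ADD CONSTRAINT invoice_allocation_constraint
    CHECK (allocated * -1 BETWEEN least(0, qty) AND greatest(0, qty));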
Existing data does not necessarily comply with the new constraint. I
wonder though what corrective action to offer a user who is the proud owner
of non-compliant data. I see a number of options:
1. Tell the user to contact the mailing list (and block their migration
in the meantime)
That may be necessary if other things fail anyway.
Sure, but it's not the best way to bind people to our software and
community: if we want to inspire confidence and awe, asking them to
contact us to "unfiddle" their data might not be a move we want to
resort to. (But it *is* an option.)
2. Modify the data to be within the bounds to be enforced (and thereby
lose the COGS audit trail)?
Modifying data is of course dangerous. The question here is what we
mean by modifying the data. Are we talking about deleting data? Or some
other modifications?
Modifying data in the sense that where the value of the "allocated"
field falls outside the allowed boundaries, we reduce (or increase) the
number so that it's within bounds again, which allows us to apply the
constraint. It won't change the COGS values already posted, but I'm not
sure that such modifications will result in the same values being
posted going forward (as without the modifications). Maybe you can
confirm that changing the 'allocated' amount on "saturated" purchase
invoices will (or won't) change the outcome for newer COGS
calculations?
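To make that concrete, the modification I have in mind is an UPDATE
along these lines (a sketch only, using the same paraphrased bounds as
above):

  -- Clamp out-of-bounds allocations back into the allowed window.
  -- Already-posted COGS amounts stay untouched, but the audit trail
  -- for the clamped portion is silently lost.
  UPDATE invoice
     SET allocated = -1 * greatest(least(0, qty),
                                   least(allocated * -1, greatest(0, qty)))
   WHERE allocated * -1 NOT BETWEEN least(0, qty) AND greatest(0, qty);

That's exactly the kind of statement I'm hesitant to run without
knowing its effect on future COGS runs, hence my question above.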
3. Allow constraint creation to fail by modifying Loadorder and create
a new script which creates a "not valid" constraint (which allows the
existing data to remain in place, but verifies new data; a sketch of
such a constraint follows below)
You could also create a constraint that checks if data is old and
passes it too.
Ok. But what criterion determines that "data is old"?
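For reference, what I mean by a "not valid" constraint is PostgreSQL's
NOT VALID option: existing rows are not checked when the constraint is
added, but all new inserts and updates are. Using the same paraphrased
check expression as above, the new script would boil down to something
like:

  -- Existing rows are left alone; new and updated rows are checked.
  -- (Check expression and constraint name are paraphrased/made up.)
  ALTER TABLE invoice
    ADD CONSTRAINT invoice_allocation_constraint
    CHECK (allocated * -1 BETWEEN least(0, qty) AND greatest(0, qty))
    NOT VALID;

If the old rows ever get cleaned up, ALTER TABLE invoice VALIDATE
CONSTRAINT invoice_allocation_constraint would promote it to a fully
checked constraint.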
4. Create a check script which creates the "not valid" constraint and
fakes script application by modifying dbchanges tables
I don't see what this gets us.
I was thinking this is the "cheap" (or rather: "easy on the
programmer") version of "data is old": it doesn't require us to
recalculate anything; it simply allows us to apply the constraint. New
data will be just as safely protected as on any other database, but old
data won't need to change.
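Concretely, I'm thinking of a script along these lines. I'm writing
the change-tracking table and its columns from memory here, so treat
db_patches and its column names as placeholders for whatever the
dbchanges infrastructure actually records:

  -- After creating the NOT VALID constraint shown above, record the
  -- original script as applied so the upgrade machinery skips it.
  -- Table name, column names and the sha value are placeholders.
  INSERT INTO db_patches (sha, path, last_updated)
  VALUES ('<sha of the original script>',
          '1.5/invoice-tbl-cogs-constraint.sql',
          now());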
5. Adjust the data back into bounds, allocating the overage to more
recent invoices and creating a COGS adjustment journal which the admin
must post
This would get my vote.
Ok. I'm thinking such a procedure would work as follows:
1. A query determines the violating rows (a sketch of such a query
follows after this list)
2. For each row, the delinquent amount (overage) is calculated and with it,
the COGS amount (money) to be reversed
3. For each delinquent amount (overage), the COGS calculation routines
calculate the new cost to be posted, based on available FIFO amounts
4. The user is presented with a GL transaction, to be posted on the
migration date (or another, selectable, posting date)
5. The user elects to proceed, causing the proposed GL transaction to be
posted.
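For steps 1 and 2, I'm picturing something like the query below. It
reuses the paraphrased bounds from earlier in this mail, so the real
check expression from the upgrade script should be substituted:

  -- Find rows violating the (paraphrased) allocation bounds and
  -- compute how far each falls outside the allowed window (overage).
  SELECT id, trans_id, parts_id, qty, allocated,
         CASE
           WHEN allocated * -1 < least(0, qty)
             THEN allocated * -1 - least(0, qty)
           WHEN allocated * -1 > greatest(0, qty)
             THEN allocated * -1 - greatest(0, qty)
           ELSE 0
         END AS overage
    FROM invoice
   WHERE allocated * -1 NOT BETWEEN least(0, qty) AND greatest(0, qty);

The overage per row is then what steps 3 through 5 convert into a
reversal amount and a proposed GL transaction.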
I'm now running into an issue though: our COGS routines destructively
modify existing data. This means that in order to propose the GL
transaction above (which requires calling the COGS routines), there's
no way *not* to accept the posted transaction, because by then the
underlying data has already been destructively modified to include it.
Which solutions do we have here? I'm seeing basically just one option:
1. Calculate the full transaction, post it and *inform* the user of the
posted data, instead of asking for their permission.
This approach has the huge downside that the user doesn't get to
choose on which date the correction journal gets posted. Any other
options?
--
Bye,
Erik.
http://efficito.com -- Hosted accounting and ERP.
Robust and Flexible. No vendor lock-in.