We frequently get requests from our customers to do mass updates. In the past, I would just take the data and generate a SQL script from it. Seems legit. This worked up to the point where we started getting too much data. Then I started splitting the jobs up into multiple scripts. Things got out of hand when I needed 15 or more scripts to do the job. Each script itself took a long time to run.
External tables came to the rescue. Now I put the data from the customer in a file on the server. Then I create an external table backed by that file. SQL*Loader does the job of reading the file and exposing it to me like a normal database table. I can't do any updates on this table, but I don't need to. I just use it as input to update the database. I loop a cursor through the records, make my updates, and the job is done.
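To make that concrete, here is a minimal sketch of the setup. The directory, file name, table names, and columns (data_dir, customer_updates.csv, customer_updates_ext, customers, customer_id, new_status) are made up for illustration; the real ones depend on whatever the customer sends.

-- Assumed: the customer file sits in a directory the database can read
CREATE OR REPLACE DIRECTORY data_dir AS '/path/to/incoming/files';

-- External table backed by the flat file; ORACLE_LOADER (SQL*Loader's
-- access driver) parses it and presents it like a normal table
CREATE TABLE customer_updates_ext (
  customer_id NUMBER,
  new_status  VARCHAR2(20)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
    (customer_id, new_status)
  )
  LOCATION ('customer_updates.csv')
)
REJECT LIMIT UNLIMITED;

-- Loop a cursor through the external table and apply the updates
BEGIN
  FOR rec IN (SELECT customer_id, new_status FROM customer_updates_ext) LOOP
    UPDATE customers
       SET status = rec.new_status
     WHERE customer_id = rec.customer_id;
  END LOOP;
  COMMIT;
END;
/

The external table itself is read-only through the loader driver, which is why the UPDATE targets the regular table while the external table only supplies the input rows.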
The beauty of this solution is that it is very fast. The script technique took a long time to execute when the scripts were large. The external table technique hardly takes any time at all.
Reproducing a Race Condition
-
We have a job at work that runs every Wednesday night. All of a sudden, it has aborted the last two weeks. This caused some critical data to be late. The main ...