DML Error Logging in Oracle 10g Database Release 2
In some situations the most obvious solution to a problem is a DML statement
(INSERT ... SELECT, UPDATE, DELETE), but you may choose to avoid DML because of the way it reacts to exceptions. By default, when
a DML statement fails the whole statement is rolled back, regardless of how many rows were processed successfully before the error was detected. In the past, the only way around this problem was to process each row individually, preferably
with a bulk operation using FORALL and the SAVE EXCEPTIONS clause. In Oracle 10g Database Release 2, the DML error logging feature was introduced to solve this problem. Adding the appropriate LOG ERRORS clause to most INSERT, UPDATE, MERGE and DELETE statements enables the operations to complete, regardless of errors. This article presents an overview of the DML error logging functionality, with examples of each type of DML statement.

Syntax

The syntax for the error logging clause is the same for INSERT, UPDATE, MERGE and DELETE statements.

LOG ERRORS [INTO [schema.]table] [('simple_expression')] [REJECT LIMIT integer|UNLIMITED]

The optional INTO clause allows you to specify the name of the error logging table. If you omit this clause, the first 25 characters of the base table name are used along with the "ERR$_" prefix. The simple_expression is used to specify a tag that makes the errors easier to identify. This might be a string or any function whose result is converted to a string. The REJECT LIMIT is used to specify the maximum number of errors before the statement fails. The default value is 0 and the maximum value is the keyword UNLIMITED. For parallel DML operations, the reject limit is applied to each parallel server.

Restrictions

The DML error logging functionality is not invoked when:

- Deferred constraints are violated.
- Direct-path INSERT or MERGE operations raise unique constraint or index violations.
- UPDATE or MERGE operations raise a unique constraint or index violation.

In addition, the tracking of errors in LONG, LOB and object type columns is not supported, although a table containing these columns can be the target of error logging.

Sample Schema

The following code creates and populates the tables necessary to run the example code in this article.

-- Create and populate a source table.
CREATE TABLE source (
  id          NUMBER(10)    NOT NULL,
  code        VARCHAR2(10),
  description VARCHAR2(50)
);
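Putting the clause together, here is a minimal sketch of a tagged load. This assumes a DEST destination table and an ERR$_DEST log table created beforehand with DBMS_ERRLOG.CREATE_ERROR_LOG('DEST'); neither is shown above, so treat the names as placeholders.

```sql
-- Hypothetical load: bad rows go to ERR$_DEST instead of failing the statement.
INSERT INTO dest
SELECT *
FROM   source
LOG ERRORS INTO err$_dest ('INSERT-LOAD-1') REJECT LIMIT UNLIMITED;

-- Inspect the rejected rows, the error message and the tag we supplied.
SELECT ora_err_number$, ora_err_mesg$, ora_err_tag$
FROM   err$_dest;
```

The tag ('INSERT-LOAD-1' here) is what lets you distinguish the failures of one batch run from another when several loads share the same error log table.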
TECHNOLOGY: Performance

Faster Batch Processing
By Mark Rittman

LOG ERRORS handles errors quickly and simplifies batch loading.

When you need to load millions of rows of data into a table, the most efficient way is usually to use an INSERT, UPDATE, or MERGE statement to process your data in bulk. Similarly, if you want to delete thousands of rows, using a DELETE statement is usually faster than using procedural code. But what if the data you intend to load contains values that might cause an integrity or check constraint to be violated, or what if some values are too big for the column they are to be loaded into? You may well have loaded 999,999 rows into your table, but that last row, which violates a check constraint, causes the whole statement to fail and roll back. In situations such as this, you have to use an alternative approach to loading your data. For example, if your data is held in a file, you can use SQL*Loader to automatically handle data that raises an error, but then you have to put together a control file, run SQL*Loader from the command line, and check the output file and the bad datafile to detect any errors. If, however, your data is held in a table or another object, you can write a procedure or an anonymous block to process your data row by row, loading the valid rows and using exception handling to process those rows that raise an error. You might even use BULK COLLECT and FORALL to handle data in your PL/SQL routine more efficiently, but even with these improvements, handling your data in this manner is still much slower than performing a bulk load by using a direct-path INSERT DML statement.
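For reference, the row-by-row PL/SQL alternative mentioned above can be sketched with BULK COLLECT and FORALL ... SAVE EXCEPTIONS. This is a hypothetical sketch: the SRC and TGT table names are assumed (matching column lists), and failed rows are reported afterwards via the SQL%BULK_EXCEPTIONS collection raised with ORA-24381.

```sql
DECLARE
   TYPE src_aat IS TABLE OF src%ROWTYPE;
   v_rows      src_aat;
   bulk_errors EXCEPTION;
   PRAGMA EXCEPTION_INIT(bulk_errors, -24381);
BEGIN
   SELECT * BULK COLLECT INTO v_rows FROM src;

   -- Insert all rows; SAVE EXCEPTIONS collects failures instead of stopping.
   FORALL i IN 1 .. v_rows.COUNT SAVE EXCEPTIONS
      INSERT INTO tgt VALUES v_rows(i);
EXCEPTION
   WHEN bulk_errors THEN
      FOR i IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
         DBMS_OUTPUT.PUT_LINE(
            'Error on row '  || SQL%BULK_EXCEPTIONS(i).ERROR_INDEX ||
            ': '             || SQLERRM(-SQL%BULK_EXCEPTIONS(i).ERROR_CODE));
      END LOOP;
END;
/
```

Even in this bulk-bound form, the load runs through the PL/SQL engine rather than as a single set-based (and potentially direct-path) statement, which is the performance gap LOG ERRORS is designed to close.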
Until now, you could take advantage of the set-based performance of INSERT, UPDATE, MERGE, and DELETE statements only if you knew that your data was free from errors; in all other circumstances, you needed to resort to slower alternatives. All of this changes with the release of Oracle Database 10g Release 2, which introduces a new SQL feature called DML error logging.

Efficient Error Handling

DML error logging enables you to write INSERT, UPDATE, MERGE, or DELETE statements that automatically deal with certain constraint violations. With this new feature, you use the new LOG ERRORS clause in your DML statement and Oracle Database
of DML error logging; a major new feature of 10g Release 2 (10.2). DML error logging enables us to re-direct "bad data" to a log table during bulk DML operations without failing the overall statement. Readers who are unfamiliar with this feature are advised to read the referenced article for an overview before continuing. In this article, we will concentrate purely on the performance characteristics of DML error logging. In particular, we will examine the following:

- the general performance cost of invoking DML error logging;
- the cost of logging errors with the DML error logging feature; and
- the performance of DML error logging in comparison with its nearest related technology: FORALL ... SAVE EXCEPTIONS.

invoking dml error logging

We will begin by looking at the general costs associated with invoking DML error logging. Of course, costs are relative and it depends how we measure them; or rather, what we measure them against. We might decide that we absolutely must use this technology in our batch systems and are prepared to "take a hit" in performance to save the potential support costs that arise in the event of failure. In this article, of course, we define cost purely in performance terms: time and resources. We also assume that DML error logging will "cost" us something (nothing is free), and we shall see below whether this is the case.

setup: sample data

For our performance tests we will create a source table and a target table. The source table will be loaded with approximately 50,000 records. The target table will have various check constraints to make data violations simple to manufacture. Our test tables are created as follows.

SQL> CREATE TABLE src
  2  NOLOGGING
  3  AS
  4  SELECT ROWNUM        AS n1
  5  ,      created       AS d1
  6  ,      'X'           AS v1
  7  ,      RPAD('x',100) AS v2
  8  FROM   dba_objects;

Table created.
SQL> SELECT COUNT(*) FROM src;

  COUNT(*)
----------
     50332

1 row selected.

SQL> CREATE TABLE tgt
  2  ( n1 NUMBER NOT NULL
  3  , d1 DATE      CHECK (EXTRACT(YEAR FROM d1) > 2004)
  4  , v1 CHAR(1)   CHECK (v1 IN ('X','Y'))
  5  , v2 CHAR(100)
  6  );

Table created.

We will need an error log table (required for DML error logging), which is created via the DBMS_ERRLOG package as follows.

SQL> BEGIN
  2     DBMS_ERRLOG.CREATE_ERROR_LOG(
  3        dml_table_name     => 'TGT',
  4        err_log_table_name => 'TGT_EXCEPTIONS'
  5     );
  6
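For reference, the table generated by DBMS_ERRLOG combines five mandatory ORA_ERR_* control columns with nullable VARCHAR2 copies of the base table's columns. A sketch of what DESCRIBE might show for the TGT_EXCEPTIONS table created above (exact display depends on your version and SQL*Plus formatting):

```sql
SQL> DESCRIBE tgt_exceptions
 Name             Type
 ---------------- --------------
 ORA_ERR_NUMBER$  NUMBER         -- Oracle error number raised for the row
 ORA_ERR_MESG$    VARCHAR2(2000) -- full error message text
 ORA_ERR_ROWID$   UROWID         -- rowid of the affected row (UPDATE/DELETE)
 ORA_ERR_OPTYP$   VARCHAR2(2)    -- operation type: I, U, D
 ORA_ERR_TAG$     VARCHAR2(2000) -- the tag from the LOG ERRORS clause
 N1               VARCHAR2(4000)
 D1               VARCHAR2(4000)
 V1               VARCHAR2(4000)
 V2               VARCHAR2(4000)
```

The data columns are stored as VARCHAR2(4000) precisely so that values which were too long or of the wrong type for the base table can still be captured in the log.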
which allows you to add a LOG ERRORS clause to most DML statements so that rows that would otherwise cause the statement to fail are instead copied to an error logging table. A typical use of LOG ERRORS is like this:

INSERT /*+ APPEND */ INTO table_a
SELECT * FROM table_b
LOG ERRORS REJECT LIMIT UNLIMITED;

The rows that would cause an error are instead written to a table called ERR$_TABLE_A that gets created for you by the DBMS_ERRLOG.CREATE_ERROR_LOG procedure. I won't go into the full details here, but Tim Hall has put together a nice write-up that you can view here. Anyway, for the article I put a set of examples together, where I used the data in the SH.SALES table to create a "source table" copy in a new schema, created another copy based off of this as a "target table", then introduced some errors into the source table and constraints into the target so that I could show how the feature worked. Finally, I used the DBMS_ERRLOG package to create the error logging table.

SQL*Plus: Release 10.2.0.1.0 - Production on Sun Dec 4 12:38:54 2005
Copyright (c) 1982, 2005, Oracle. All rights reserved.

Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options

SQL> create sequence sales_id_seq;

Sequence created.

SQL> create table sales_src
  2  as
  3  select sales_id_seq.nextval as "SALES_ID"
  4  ,      prod_id
  5  ,      cust_id
  6  ,      channel_id
  7  ,      time_id
  8  ,      promo_id
  9  ,      amount_sold
 10  ,      quantity_sold
 11  from   sh.sales
 12  /

Table created.

SQL> create table sales_target
  2  as
  3  select *
  4  from   sales_src
  5  where  1=0
  6  /

Table created.

SQL> alter table sales_src
  2  modify promo_id null
  3  /

Table altered.

SQL> select min(sales_id)
  2  ,      max(sales_id)
  3  from   sales_src
  4  /

MIN(SALES_ID) MAX(SALES_ID)
------------- -------------
            1        918843

SQL> update sales_src
  2  set    promo_id = null
  3  where  sales_id in (5000,6000,7000)
  4  /

3 rows updated.

SQL> update sal