
Unusable INDEX on PARTITION table

When an index on a table is unusable, it prevents updating and inserting new data.

Every day a clean-up task would mess up a partitioned table. Specifically, the following command made the indexes unusable:


ALTER TABLE table_name DROP PARTITION partition_name;

As a result the table had unusable indexes, which prevented importing new data.
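To confirm which indexes are in this state, the data dictionary can be queried (assuming the indexes belong to the current schema; use the DBA_ views otherwise):

```sql
-- Indexes flagged UNUSABLE in the current schema
SELECT index_name, status
FROM   user_indexes
WHERE  status = 'UNUSABLE';

-- For a partitioned (LOCAL) index, the status is tracked per partition
SELECT index_name, partition_name, status
FROM   user_ind_partitions
WHERE  status = 'UNUSABLE';
```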

 

Description of the partitioned table

The table with the unusable indexes is partitioned and has two indexes: one on the primary key and one on a date column, MY_TABLE_DATE.
The table is partitioned with the “Range Partitioning” method.

Range partitioning maps data to partitions based on ranges of partition key values that you establish for each partition. It is the most common type of partitioning and is often used with dates. For example, you might want to partition sales data into monthly partitions.

https://docs.oracle.com/cd/B10500_01/server.920/a96524/c12parti.htm

The table is partitioned on the date column and has a primary key:


create table MY_TABLE
(
ID NUMBER GENERATED BY DEFAULT AS IDENTITY ,
MY_TABLE_DATE DATE NOT NULL,
NAME VARCHAR2(3) NOT NULL,
SURNAME VARCHAR2(20) NOT NULL,
PRIMARY KEY (ID)
)
PCTFREE 10
TABLESPACE tab_p_specified_space
PARTITION BY RANGE (MY_TABLE_DATE )
(PARTITION P_20100101 VALUES LESS THAN (TO_TIMESTAMP('2010/01/01 00:00:00','YYYY/MM/DD HH24:MI:SS') )
PCTFREE 10
TABLESPACE tab_p_specified_space
);

Actually this primary key is a problem according to the Oracle documentation: the partitioning column should be part of the primary key. From “Partitioned Index-Organized Tables”:

For partitioning an index-organized table
Partition columns must be a subset of primary key columns

https://docs.oracle.com/cd/B10500_01/server.920/a96524/c12parti.htm

To make the partitioning column part of the primary key, do something like this:


ALTER TABLE MY_TABLE DROP CONSTRAINT TABLE_PK;
ALTER TABLE MY_TABLE ADD CONSTRAINT TABLE_PK PRIMARY KEY (ID, MY_TABLE_DATE) USING INDEX LOCAL TABLESPACE tab_p_specified_space;

Unfortunately this modification of the partitioned table did not fix the problem either.

Solution

To fix the unusable indexes I used the “UPDATE INDEXES” clause suggested by this article: https://docs.oracle.com/database/121/VLDBG/GUID-1D59BD49-CD86-4BFE-9099-D3B8D7FD932A.htm#VLDBG1122

With “UPDATE INDEXES”, the indexes are maintained as part of the DROP PARTITION command, so they stay usable afterwards.


requestDeletePartition := 'ALTER TABLE BASE.MY_TABLE DROP PARTITION P_' || TO_CHAR(partition_date, 'YYYYMMDD') || 'UPDATE INDEXES';

Unfortunately the result was the following error :

SQLCODE : -14048
SQLERRM : ORA-14048: a partition maintenance operation may not be combined with other operations

I spent some time trying to figure out this error. The solution is simple: I added a space between the quote and UPDATE INDEXES, like this:


requestDeletePartition := 'ALTER TABLE BASE.MY_TABLE DROP PARTITION P_' || TO_CHAR(partition_date, 'YYYYMMDD') || ' UPDATE INDEXES';

Dropping the partition with UPDATE INDEXES keeps the indexes usable afterwards, but this fix is not sustainable for every scenario: it means maintaining the indexes every day on a table holding potentially millions of rows. While the indexes are being updated, the table cannot be used, so this command should be run when nobody is using the application.
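Putting it together, the daily clean-up could look like the sketch below. The partition naming scheme P_YYYYMMDD, the one-year retention, and the variable names are assumptions based on the table definition above:

```sql
DECLARE
  requestDeletePartition VARCHAR2(400);
BEGIN
  -- Assumed naming scheme P_YYYYMMDD and an assumed one-year retention
  requestDeletePartition := 'ALTER TABLE BASE.MY_TABLE DROP PARTITION P_'
                            || TO_CHAR(SYSDATE - 365, 'YYYYMMDD')
                            || ' UPDATE INDEXES';  -- note the leading space
  EXECUTE IMMEDIATE requestDeletePartition;
END;
/
```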

Annexes

Another tested solution

A quick search turns up interesting articles on how to fix the problem:


ALTER INDEX index_name REBUILD;

https://docs.oracle.com/database/121/SPATL/alter-index-rebuild.htm#SPATL1017

This is similar to the previous solution: the table is not usable while the indexes are being rebuilt.
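On a partitioned table the rebuild has to target each index partition separately; a sketch that rebuilds only the partitions flagged UNUSABLE (again assuming the current schema):

```sql
-- Rebuild only the index partitions that are actually unusable
BEGIN
  FOR p IN (SELECT index_name, partition_name
            FROM   user_ind_partitions
            WHERE  status = 'UNUSABLE')
  LOOP
    EXECUTE IMMEDIATE 'ALTER INDEX ' || p.index_name
                      || ' REBUILD PARTITION ' || p.partition_name;
  END LOOP;
END;
/
```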

Another possible solution
I did not test this one, but it should fix the problem too.

https://www.toadworld.com/platforms/oracle/b/weblog/archive/2014/09/26/the-trick-of-drop-a-table-partition-without-impact-the-global-index
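The trick described there relies on the UPDATE GLOBAL INDEXES clause, which maintains a global index while the partition is dropped. Untested here, but applied to the table above it would look like:

```sql
ALTER TABLE BASE.MY_TABLE DROP PARTITION P_20100101 UPDATE GLOBAL INDEXES;
```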


SQL*Loader has “ORA-01722: invalid number” with a CSV file

The problem

While importing a CSV file with SQL*Loader I had this error:

Record 1: Rejected - Error on table MY_TABLE, column MY_COLUMN3.
ORA-01722: invalid number

Record 2: Rejected - Error on table MY_TABLE, column MY_COLUMN3.
ORA-01722: invalid number

Table MY_TABLE:
  1 Row successfully loaded.
  11 Rows not loaded due to data errors.
  0 Rows not loaded because all WHEN clauses were failed.
  0 Rows not loaded because all fields were null.

I spent hours on this problem.

My configuration

My CTL file:

OPTIONS (SKIP=0,ERRORS=1000,SILENT=(FEEDBACK))
LOAD DATA
APPEND
INTO TABLE MY_TABLE
FIELDS TERMINATED by ';'
TRAILING NULLCOLS
(
MY_COLUMN1 DECIMAL EXTERNAL,
MY_COLUMN2 DECIMAL EXTERNAL,
MY_COLUMN3 DECIMAL EXTERNAL
)

My table:

CREATE TABLE &1.."MY_TABLE"
(
  ID         NUMBER GENERATED BY DEFAULT AS IDENTITY,
  MY_COLUMN1 FLOAT NOT NULL,
  MY_COLUMN2 FLOAT NOT NULL,
  MY_COLUMN3 FLOAT NOT NULL
)
PCTFREE 10
TABLESPACE "MY_TABLESPACE"

My CSV file:

3.62;0.091;11.2;-3.54
0.613;0.181;169.4;-10.18
..

The solution

First I checked what the error meant:
http://www.dba-oracle.com/t_ora_01722_sql_loader_sqlldr.htm

This error can mean there are blanks in the file, but after checking the CSV file there were none. However I noticed that the last column, MY_COLUMN3, which was causing the problems, had negative values. But according to the Oracle documentation the Oracle FLOAT type can handle negative values, and the CTL file was correctly configured to handle them with MY_COLUMN3 DECIMAL EXTERNAL. So what was going on?

Finally I discovered that after manually editing the CSV file on the unix server the problem was solved! So I figured out the problem was not coming from the CTL file but from the CSV file.

After much trial and error I found out the file was a DOS file (Windows CRLF line endings):

vi /data/myfile.csv
"/data/myfile.csv" [noeol][dos] 13L, 832C
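The same diagnosis can be made without vi; the file path below is only for illustration:

```shell
# Create a sample file with Windows (CRLF) line endings, like the Excel export
printf '3.62;0.091;11.2\r\n0.613;0.181;169.4\r\n' > /tmp/sample.csv

# od -c makes the invisible \r visible at the end of each line
od -c /tmp/sample.csv
```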

This CSV file actually comes from an Excel document on Windows and was transferred to the unix server. Each line therefore ends with a carriage return + line feed (\r\n); SQL*Loader reads the invisible \r as part of the last field, so MY_COLUMN3 receives values like "11.2\r", which are not valid numbers.

So I ran dos2unix on the file and it fixed my issue!
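dos2unix simply strips the carriage returns; where it is not installed, tr gives the same result (the file paths here are illustrative):

```shell
# Sample DOS file with CRLF line endings
printf '3.62;0.091;11.2\r\n0.613;0.181;169.4\r\n' > /tmp/dosfile.csv

# dos2unix equivalent: delete every carriage return character
tr -d '\r' < /tmp/dosfile.csv > /tmp/unixfile.csv
```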