I am regularly given a dmp file that is created with Oracle 11g (using the exp utility).
I used to import this file into the Western European edition of Oracle 10g XE. The import would terminate successfully without warnings, but there was an error log (alert_xe.log) that would constantly grow in size because I was using the 32-bit Oracle Database on a 64-bit Windows OS.
I have now installed 11g XE and I am trying to import the same dmp file but I am seeing the following in the import log file:
import done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set
import server uses AL32UTF8 character set (possible charset conversion)
export client uses WE8ISO8859P1 character set (possible charset conversion)
and the import terminates with warnings because I get many errors like the following:
IMP-00019: row rejected due to ORACLE error 12899
IMP-00003: ORACLE error 12899 encountered
ORA-12899: value too large for column XXX (actual: 256, maximum: 255)
I understand that the cause of the problem is that the source database uses byte length semantics while my new 11g XE database uses a multibyte character set (AL32UTF8).
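To make the arithmetic in the error concrete: under a single-byte character set such as WE8MSWIN1252 every character occupies exactly one byte, while in AL32UTF8 accented Western European characters occupy two bytes. A sketch of what goes wrong (the table and column names here are made up for illustration):

```sql
-- In the source database (WE8MSWIN1252), VARCHAR2(255) means 255 bytes,
-- and every character fits in one byte, so a 255-character value is legal.
CREATE TABLE demo (txt VARCHAR2(255));

-- After import into an AL32UTF8 database, a character such as 'é'
-- occupies two bytes. A 255-character value containing one accented
-- character therefore needs 256 bytes, which no longer fits in a
-- VARCHAR2(255 BYTE) column:
-- ORA-12899: value too large for column ... (actual: 256, maximum: 255)
```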
I have no control over the source database so I cannot change anything there.
Moreover, I can't pre-create the tables with column definitions that use character length semantics instead of byte semantics (as suggested, for example, in Character set in Oracle 11g r2 XE), because the source database schema sometimes changes (columns might get added) without my being notified, and that breaks the import.
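For reference, the pre-creation workaround that the linked answer describes amounts to declaring columns with character length semantics before importing, roughly like this (hypothetical table and column names):

```sql
-- Length counted in characters, not bytes: up to 255 characters fit
-- regardless of how many bytes each character needs in AL32UTF8.
CREATE TABLE xxx (col1 VARCHAR2(255 CHAR));

-- Alternatively, make CHAR semantics the default for the session
-- before creating the tables:
ALTER SESSION SET NLS_LENGTH_SEMANTICS = CHAR;
```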
Is there a solution to this problem? Is there any way to use WE8MSWIN1252 with Oracle 11g XE?
I had the same issue. When I ran the command to check the character sets, it showed that my Oracle server uses the AL32UTF8 character set while the export dump file was created with an AR8MSWIN1256 client character set.
So I just changed the Oracle database character set to AR8MSWIN1256 with the following command.
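The original commands were not included in the post; they were presumably along these lines (run as SYSDBA — and note that changing the database character set is a drastic, restricted operation, so check the Oracle Globalization Support documentation before attempting it):

```sql
-- Check the current database character set:
SELECT value
  FROM nls_database_parameters
 WHERE parameter = 'NLS_CHARACTERSET';

-- Change it (only permitted in limited cases, e.g. to a strict
-- superset of the current character set):
ALTER DATABASE CHARACTER SET AR8MSWIN1256;
```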
Then I ran it again.
I hope this answer will help someone.