
DMU Frequently Asked Questions



Question: I set the JDK location incorrectly when starting the DMU. How can I change the JDK location?

Answer: The DMU stores the java executable path in ~/.dmu_jdk on Unix-based platforms. If you remove this file, the tool will prompt for the full path again. If the java executable is found in PATH, or JAVA_HOME is defined, the tool will not prompt. On Microsoft Windows, you can edit the JDK location in the file dmu\dmu\bin\dmu32.conf (DMU with 32-bit JDK) or in the file dmu\dmu\bin\dmu64.conf (DMU with 64-bit JDK) under the DMU installation folder. Look for the keyword SetJavaHome.


Question: What is the recommended strategy for cleansing invalid data?

Answer: Invalid data is often the result of storing data in an encoding different from the database character set by using the pass-through configuration, which bypasses the client/server character set conversion. To migrate such data correctly, you must identify the actual encoding used for the data. The DMU character set tagging feature lets you analyze columns containing invalid data by re-rendering them in different character sets. After the actual character set is confirmed and tagged to the column, it is used in all subsequent data scanning and conversion operations. If you believe all data in the database is stored in a different character set (for example, WE8MSWIN1252 data stored in a WE8ISO8859P1 database), you can set it in the "Assumed Database Character Set" field on the database properties tab.
Invalid data can also be caused by application bugs or by binary data stored in character columns. See Chapter 6 of the DMU User's Guide for more cleansing scenarios.
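The pass-through scenario can be sketched in plain Python (hypothetical byte values, not DMU code). For example, WE8MSWIN1252 (Windows-1252) bytes stored unconverted in a US7ASCII database are invalid under the declared character set, but render correctly once the actual character set is identified and tagged:

```python
# Sketch of pass-through data (hypothetical value, not DMU internals):
# WE8MSWIN1252 bytes stored unconverted in a US7ASCII database.
raw = b"\x93quoted\x94"  # Windows-1252 curly quotes around "quoted"

# Interpreted in the declared database character set, the bytes are invalid:
try:
    raw.decode("ascii")
except UnicodeDecodeError as exc:
    print("invalid in US7ASCII:", exc)

# Interpreted in the actual character set, the value renders correctly:
print(raw.decode("cp1252"))
```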


Question: What is the recommended strategy for cleansing data expansion issues?

Answer: When migrating non-ASCII data to Unicode, the resulting data may expand in size because of its multi-byte representation in Unicode. Data expansion issues manifest either as "exceed column limit" issues or as "exceed data type limit" issues.
For "exceed column limit" issues, you have the following options:
• Lengthen the column
• Change the length semantics of the column from bytes to characters
• Shorten the stored values manually
• Allow DMU to truncate the values during conversion
• Edit the value to replace characters that expand in conversion
• Migrate to a larger data type
For "exceed data type limit" issues, the options are:
• Migrate to a larger data type
• Shorten the stored values manually
• Allow DMU to truncate the values during conversion
• Edit the value to replace characters that expand in conversion
Cleansing actions that change column definitions may not be suitable for production environments, since they typically require corresponding updates to application code. The DMU provides scheduled cleansing actions for such cases: the changes are saved in the DMU repository and executed later, during the conversion phase, as part of the downtime window. To define a scheduled cleansing action, select "Schedule Column Modification..." from the cleansing editor context menu on the target column.
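The expansion issue can be illustrated with a short sketch (plain Python with hypothetical values, not DMU code): a value that fits a byte-based column limit in a single-byte character set may exceed it after conversion to AL32UTF8, while its length in characters is unchanged, which is why switching to character length semantics is one of the options above:

```python
# Sketch: byte expansion when single-byte data is converted to UTF-8.
# The column limit and values are hypothetical.

COLUMN_LIMIT_BYTES = 5          # e.g. VARCHAR2(5 BYTE)

for value in ("cafe", "crème"):
    single_byte = len(value.encode("latin-1"))  # size in WE8ISO8859P1
    utf8 = len(value.encode("utf-8"))           # size after conversion
    chars = len(value)                          # length in characters
    fits = utf8 <= COLUMN_LIMIT_BYTES
    print(f"{value!r}: {single_byte} -> {utf8} bytes, "
          f"{chars} chars, fits: {fits}")

# 'crème' occupies 5 bytes in WE8ISO8859P1 but 6 bytes in AL32UTF8, so it
# no longer fits VARCHAR2(5 BYTE); VARCHAR2(5 CHAR) would still hold it,
# since it remains 5 characters long.
```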


Question: What are the DMU conversion criteria for the data dictionary?

Answer: In general, the DMU does not support converting the data dictionary in this release if there is convertible data in data dictionary tables, except for the following:
• CLOB columns – this is necessary only in a single-byte database
• Binary XML token manager tables, with names like XDB.X$QN% and XDB.X$NM%
• PL/SQL source code: text of CREATE PROCEDURE, CREATE FUNCTION, CREATE PACKAGE, CREATE PACKAGE BODY, CREATE TYPE BODY, CREATE TRIGGER, and CREATE LIBRARY; type specifications (CREATE TYPE) are not converted
• View definitions: text of CREATE VIEW
• The columns:
o SYS.SCHEDULER$_JOB.NLS_ENV – NLS environment for Database Scheduler jobs (DBMS_SCHEDULER)
o SYS.SCHEDULER$_PROGRAM.NLS_ENV – NLS environment for Database Scheduler job programs (DBMS_SCHEDULER)
o SYS.JOB$.NLS_ENV – NLS environment for legacy jobs (DBMS_JOB)
o CTXSYS.DR$INDEX_VALUE.IXV_VALUE – attribute values of Oracle Text policies
o over 50 different columns in SYS, SYSTEM, and CTXSYS schemas that contain user comments for various database objects
The PL/SQL source code and the view source text are kept in multiple tables. The DMU checks the following columns when processing the source code and view definitions:
• SYS.VIEW$.TEXT – view definition text
• SYS.SOURCE$.SOURCE – PL/SQL and Java source code
• SYS.ARGUMENT$.PROCEDURE$ – PL/SQL argument definitions: procedure name
• SYS.ARGUMENT$.ARGUMENT – PL/SQL argument definitions: argument name
• SYS.ARGUMENT$.DEFAULT$ – PL/SQL argument definitions: default value
• SYS.PROCEDUREINFO$.PROCEDURENAME – names of procedures and functions declared in packages
• SYS.IDL_CHAR$.PIECE – internal representation of PL/SQL
• SYS.PLSCOPE_IDENTIFIER$.SYMREP – internal representation of PL/SQL; this table is new in Oracle Database 11g
The DMU does not report convertible character data in the tables and columns listed above as a convertibility issue. Any convertible data in the remaining tables and columns of the data dictionary is flagged as a convertibility issue in scan reports and on the Migration Status tab. The database conversion step cannot be started before the flagged data is removed. Cleansing operations are not allowed on data dictionary tables.

Related Questions


Question: Why do I sometimes get performance warnings when applying filters in the cleansing editor?

Question: The migration status tab says "The current setting rules out CTAS conversion method for tables with Row Movement disabled". What does it mean?

Question: How do I monitor the table-level conversion progress in the conversion phase?

Question: I got ORA-12721 during altering database character set to Unicode in the conversion phase. How can I diagnose the offending session?

Question: A DMU scan fails on some tables with "ORA-29913: error in executing ODCIEXTTABLEOPEN callout". How can I resolve this?

Question: Why does DMU report Unicode replacement characters as invalid in the validation mode?

Question: I have fixed some data issues. How come the DMU status icons still show the affected objects as not ready for conversion?

Question: Does the DMU have a rollback feature?

Answer: The DMU does not offer a conversion rollback feature per se, but it comes with built-in conversion error handling: if the conversion process is interrupted by an error condition, you can resolve the issue and resume the conversion. If you really need to roll back your database to its state before the conversion, restore from backup or use the Flashback Database feature.

Note: The Flashback Database feature has not been tested across the ALTER DATABASE CHARACTER SET (ADBCS) statement. While the design of the feature should not conflict with ADBCS, it is recommended that you restore from backup if ADBCS has already been performed during the conversion process, that is, if the query SELECT value FROM nls_database_parameters WHERE parameter='NLS_CHARACTERSET' shows the target character set. If you have no backup available and are forced to try Flashback Database after ADBCS, make sure you restart the instance immediately after the flashback command. See Oracle Database Backup and Recovery User's Guide for more information on FLASHBACK DATABASE and its requirements prior to starting the conversion process.

Question: Some DMU operations appear to hang or take unusually long on my database. What could cause this?