
Oracle Database Migration Assistant for Unicode (DMU): Frequently Asked Questions



Question: How to deal with convertible data in AWR tables (WRI$_%, WRH$_%, WRR$_%)?

Answer: The SYS schema contains a number of tables with names beginning with WRI$_, WRH$_, and WRR$_, which comprise the Automatic Workload Repository (AWR). In addition to historical object statistics, this repository stores snapshots of vital system statistics, such as those visible in various fixed views, for example, V$SYSSTAT and V$SQLAREA.
If non-ASCII characters are used in object names or in SQL statements, for example in character literals or comments, they may be captured into the AWR tables. The DMU scan reports such characters as convertible data dictionary content, which prevents conversion of the database. To remove this data completely, recreate the Automatic Workload Repository by logging in to SQL*Plus with SYSDBA privileges and running:
SQL> @?/rdbms/admin/catnoawr.sql
SQL> @?/rdbms/admin/catawr.sql
SQL> execute dbms_swrf_internal.register_local_dbid;


As the catawr.sql script is not present in Oracle Database versions 10.2.0.4 and earlier, Oracle recommends that you install the Oracle Database patch set 10.2.0.5 before purging AWR contents.
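The classification described above can be sketched as a simple check: a value counts as convertible when it contains any character outside the 7-bit ASCII range. The snippet below is a minimal illustration in Python (not part of the DMU itself), and the sample SQL texts are hypothetical:

```python
# Sketch: flag strings that a scan would classify as convertible data,
# i.e. text containing characters outside the 7-bit ASCII range.

def is_convertible(text: str) -> bool:
    """True if the string contains any non-ASCII character."""
    return any(ord(ch) > 127 for ch in text)

# Hypothetical captured statements, not taken from a real AWR.
captured = [
    "SELECT ename FROM emp WHERE note = 'café'",  # non-ASCII literal
    "SELECT 1 FROM dual",                         # pure ASCII
]
flagged = [sql for sql in captured if is_convertible(sql)]
```

Here `flagged` ends up holding only the statement with the non-ASCII literal, which is the kind of content the DMU would report in the AWR tables.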


Question: Why do I get warnings for modifying columns under the Oracle E-Business Suite schemas?

Answer: Modifying the structure of Oracle E-Business Suite schemas is not supported, as it may cause the Oracle applications to malfunction. You should only modify such columns if the affected table is a custom table created by you or if you have been advised to do so by Oracle Support.


Question: I just removed the characters causing invalid binary representation in my CLOB data cells in the cleansing editor. Why are they still highlighted as exceptions?

Answer: This is expected because rescanning larger data types on the client side can be very expensive. Filtering and highlighting in the cleansing editor for CLOB, LONG, and XMLType columns are based on the most recent scan results for the table, which do not change until you rescan the table or column.
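For context, "invalid binary representation" means the stored bytes are not legal in the declared character set. A minimal sketch of that check, with Python and UTF-8 standing in for the database character set (the DMU performs the real check during a scan):

```python
# Sketch: a cell has invalid binary representation when its raw bytes
# cannot be decoded in the assumed character set. UTF-8 is used here
# as a stand-in for the database character set.

def has_invalid_representation(raw: bytes, charset: str = "utf-8") -> bool:
    try:
        raw.decode(charset)
        return False
    except UnicodeDecodeError:
        return True
```

Until a rescan re-runs a check like this against the stored bytes, the editor can only show what the last scan recorded, which is why a cleaned cell may still be highlighted.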


Question: How does the DMU report data cells that exhibit both invalid binary representation and size expansion issues?

Answer: The DMU classifies each data cell under only one of the scan result categories. Values that have invalid binary representation issues are classified only as such, even if their lengths also exceed the column or data type limit after the conversion. You are generally expected to resolve the invalid data issues and rescan the data before attempting conversion. Because the DMU also allows you to ignore the invalid data issues and force the conversion of a column, be aware that forcefully converted values with invalid binary representation may additionally be truncated. You can compare the value of the Maximum Post-conversion Length property of the column with the column and data type length limits to see whether truncation will take place.
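The truncation comparison described above can be sketched as follows. This is a simplified illustration, assuming a single-byte source character set and UTF-8 as a stand-in for the AL32UTF8 target; in practice the limit comes from the column definition and the length from the DMU's Maximum Post-conversion Length property:

```python
# Sketch: decide whether a converted value would exceed the column's
# byte limit after conversion to the target character set (UTF-8 here
# stands in for AL32UTF8).

def will_truncate(raw: bytes, src_charset: str, column_byte_limit: int) -> bool:
    """Compare the post-conversion byte length against the column limit."""
    converted = raw.decode(src_charset).encode("utf-8")
    return len(converted) > column_byte_limit
```

For example, 'café' occupies 4 bytes in Latin-1 but 5 bytes in UTF-8, so a 4-byte column would truncate it after conversion, while the pure-ASCII 'cafe' converts without expansion.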

Related Questions


Question: When starting, the DMU consistently picks up an older version of the JDK. How to specify the location of the correct JDK?

Question: Is the downtime needed to convert a database with the DMU directly proportional to the size of the database?

Question: The DMU reports invalid representation data in the table SYS.BOOTSTRAP$ in an Oracle Database 12.1.0.2 PDB. How to handle this data?

Question: Can I use the DMU if my database version is not listed in the supported configurations?

Question: Which JDK version is required to run the DMU client?

Question: Can I run the DMU client remotely to migrate the database?

Question: What are the hardware requirements for a database server to migrate a database with the DMU?

Question: What are the hardware requirements for running the DMU client?