Understanding and Using the Oracle 8i Analysis Tool LogMiner

  • 2020-05-15 02:26:59
  • OfStack

The ORACLE tutorial you are looking at is: understanding and using Oracle 8i analysis tool LogMiner.

Oracle LogMiner is a very useful analysis tool that Oracle has provided since release 8i. With it you can easily obtain the specific contents of Oracle redo log files (including archived log files); in particular, the tool can analyze all DML operations on the database (INSERT, UPDATE, DELETE, etc.), and it can also reconstruct the SQL statements needed to roll those operations back. This tool is especially useful for debugging, auditing, or backing out a particular transaction.

The LogMiner analysis tool is actually made up of a set of PL/SQL packages and some dynamic views (part of the Oracle 8i built-in packages). It is distributed as part of the Oracle database and is a completely free tool provided with the 8i product. However, this tool is a little more complex to use than other Oracle built-in tools, mainly because it does not provide any graphical user interface (GUI). This article details how to install and use the tool.

1. Use of LogMiner

The redo log files hold all of the data needed for database recovery and record every change made to the database, that is, all DML statements that operate on the database.

Prior to Oracle 8i, Oracle did not provide any tool to help the database administrator read and interpret the contents of the redo log files. If something went wrong with the system, the only thing an ordinary database administrator could do was package up all the log files, send them to Oracle technical support, and then quietly wait for Oracle's technical support to give the final answer. Since 8i, however, Oracle has provided just such a powerful tool: LogMiner.

The LogMiner tool can be used to analyze both online and offline log files, and both the redo log files of its own database and the redo log files of other databases.

Overall, the main uses of the LogMiner tool are:

1. Tracking database changes: you can track database changes offline without affecting the performance of the online system.

2. Rolling back database changes: you can roll back specific changed data, reducing the need for point-in-time recovery.

3. Optimization and capacity planning: data growth patterns can be determined by analyzing the data in the log files.

2. Install LogMiner

To install the LogMiner tool, you must first run the following two scripts:

1. $ORACLE_HOME/rdbms/admin/dbmslm.sql

2. $ORACLE_HOME/rdbms/admin/dbmslmd.sql

Both scripts must be run as the SYS user. The first script creates the DBMS_LOGMNR package, which is used to analyze log files. The second script creates the DBMS_LOGMNR_D package, which is used to create the data dictionary file.

3. Use the LogMiner tool

The following section details how to use the LogMiner tool.

1. Create data dictionary file (data-dictionary)

As already mentioned, the LogMiner tool actually consists of two new PL/SQL built-in packages (DBMS_LOGMNR and DBMS_LOGMNR_D) and four V$ dynamic performance views (created when DBMS_LOGMNR.START_LOGMNR starts LogMiner). Before using the LogMiner tool to analyze redo log files, you can use the DBMS_LOGMNR_D package to export the data dictionary as a text file. The dictionary file is optional, but without it, parts of the data dictionary (table names, column names, etc.) and the values interpreted by LogMiner would appear in hexadecimal form and could not be read directly. For example, the following sql statement:
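A minimal illustrative statement (the table, columns, and values are hypothetical, not taken from the original article):

```sql
INSERT INTO emp (empno, ename, sal) VALUES (7999, 'SMITH', 3000);
```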


Without a dictionary file, LogMiner will present the result as follows:
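A sketch of the kind of output LogMiner produces in this case: only internal object numbers and raw hexadecimal values are shown (the object number, column numbers, and hex strings below are invented for illustration):

```sql
insert into "UNKNOWN"."OBJ# 3045"("COL 1","COL 2","COL 3")
  values (HEXTORAW('c25064'), HEXTORAW('534d495448'), HEXTORAW('c21f'));
```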


The purpose of creating a data dictionary file is to let LogMiner refer to internal data dictionary objects by their actual names rather than by the system's internal hexadecimal identifiers. The data dictionary file is a text file created with the DBMS_LOGMNR_D package. If tables in the database being analyzed change, altering the data dictionary, the dictionary file needs to be recreated. Similarly, when analyzing the redo log files of another database, the data dictionary file must be regenerated from the database being analyzed.

First, in the init.ora initialization parameter file, specify the location of the data dictionary file by adding the parameter UTL_FILE_DIR, whose value is the server directory where the data dictionary file will be placed. For example:
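A sketch of the parameter line, assuming a hypothetical directory e:\oracle\logs:

```
UTL_FILE_DIR = e:\oracle\logs
```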


Restart the database to make the new parameter take effect, and then create the data dictionary file:
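A sketch of the call to DBMS_LOGMNR_D.BUILD; the dictionary file name is illustrative, and the location must match the UTL_FILE_DIR directory configured above:

```sql
BEGIN
  -- writes the dictionary as a text file into the UTL_FILE_DIR directory
  dbms_logmnr_d.build(
    dictionary_filename => 'v816dict.ora',
    dictionary_location => 'e:\oracle\logs');
END;
/
```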


2. Create a list of log files to analyze

There are two types of Oracle redo logs: online redo log files and offline (archived) log files. Below we discuss how to create a list for each of these two kinds of log files.

(1) Analyze online redo log files

A. Create a list
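A sketch of the call, assuming a hypothetical online redo log file at e:\oracle\oradata\redo01.log; the dbms_logmnr.new option starts a fresh list:

```sql
BEGIN
  dbms_logmnr.add_logfile(
    LogFileName => 'e:\oracle\oradata\redo01.log',
    Options     => dbms_logmnr.new);  -- begin a new list
END;
/
```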


B. Add other log files to the list
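To append a further file to the existing list, the same procedure is called with the dbms_logmnr.addfile option (the path is again hypothetical):

```sql
BEGIN
  dbms_logmnr.add_logfile(
    LogFileName => 'e:\oracle\oradata\redo02.log',
    Options     => dbms_logmnr.addfile);  -- append to the current list
END;
/
```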


(2) Analyze offline (archived) log files

A. Create a list
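For archived logs the calls are the same; only the file name differs. A sketch, assuming a hypothetical archived log file name:

```sql
BEGIN
  dbms_logmnr.add_logfile(
    LogFileName => 'e:\oracle\oradata\archive\ARC00001.001',
    Options     => dbms_logmnr.new);  -- begin a new list
END;
/
```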


B. Add additional log files to the list
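Additional archived files are appended with the addfile option (again a hypothetical file name):

```sql
BEGIN
  dbms_logmnr.add_logfile(
    LogFileName => 'e:\oracle\oradata\archive\ARC00002.001',
    Options     => dbms_logmnr.addfile);  -- append to the current list
END;
/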


It is entirely up to you how many log files to include in this list, but it is recommended that you add only one log file at a time and analyze it before adding the next.

In contrast to adding files to the analysis list, a log file can also be removed from the list by calling 'dbms_logmnr.add_logfile' with the 'dbms_logmnr.removefile' option. The following example removes the log file e:\Oracle\oradata\ redo02.log added above.
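A sketch of the removal call (the path is illustrative and should match the file added earlier):

```sql
BEGIN
  dbms_logmnr.add_logfile(
    LogFileName => 'e:\oracle\oradata\redo02.log',
    Options     => dbms_logmnr.removefile);  -- drop this file from the list
END;
/
```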


After creating a list of log files to analyze, you are ready to analyze them.

3. Log analysis using LogMiner

(1) Without restrictions
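In the simplest case only the dictionary file is supplied, and every record in the listed log files is analyzed. A sketch, assuming the dictionary file created earlier at the hypothetical path e:\oracle\logs\v816dict.ora:

```sql
BEGIN
  dbms_logmnr.start_logmnr(
    DictFileName => 'e:\oracle\logs\v816dict.ora');
END;
/
```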


(2) With restrictions

By setting different parameters in the procedure DBMS_LOGMNR.START_LOGMNR (the meanings of the parameters are shown in Table 1), you can narrow the scope of the log files to be analyzed. By setting start and end time parameters, we can restrict the analysis to logs within a given time range. In the following example, we analyze only the logs dated September 18, 2001:
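A sketch of the time-restricted call (the dictionary path is the hypothetical one used above):

```sql
BEGIN
  dbms_logmnr.start_logmnr(
    DictFileName => 'e:\oracle\logs\v816dict.ora',
    StartTime    => to_date('2001-09-18 00:00:00', 'YYYY-MM-DD HH24:MI:SS'),
    EndTime      => to_date('2001-09-18 23:59:59', 'YYYY-MM-DD HH24:MI:SS'));
END;
/
```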


You can also limit the scope of the log to be analyzed by setting start SCN and end SCN:
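For example (the SCN values and dictionary path are illustrative):

```sql
BEGIN
  dbms_logmnr.start_logmnr(
    DictFileName => 'e:\oracle\logs\v816dict.ora',
    StartScn     => 20,
    EndScn       => 50);
END;
/
```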


Table 1. Meanings of the parameters of procedure DBMS_LOGMNR.START_LOGMNR

StartScn: analyze only redo records whose SCN is greater than or equal to this value (default 0, meaning no SCN lower bound)
EndScn: analyze only redo records whose SCN is less than or equal to this value (default 0, meaning no SCN upper bound)
StartTime: analyze only redo records generated at or after this time (ignored when StartScn is specified)
EndTime: analyze only redo records generated at or before this time (ignored when EndScn is specified)
DictFileName: full path of the data dictionary file
Options: LogMiner session options


4. Observing the analysis results (v$logmnr_contents)

At this point, the contents of the redo log files have been analyzed. The dynamic performance view v$logmnr_contents contains all of the information obtained from the LogMiner analysis.
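For example, the reconstructed redo and undo SQL for every mined change can be listed with a query such as the following (OPERATION, SQL_REDO, and SQL_UNDO are documented columns of the view):

```sql
SELECT operation, sql_redo, sql_undo
  FROM v$logmnr_contents;
```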


If we only want to know what a particular user has done to a particular table, we can find out with the SQL query below, which returns all of the work the user DB_ZGXT has done on the table SB_DJJL.
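A sketch of that query, filtering on the USERNAME and SEG_NAME columns of v$logmnr_contents:

```sql
SELECT sql_redo
  FROM v$logmnr_contents
 WHERE username = 'DB_ZGXT'
   AND seg_name = 'SB_DJJL';
```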


One point to emphasize: the analysis results in the view v$logmnr_contents exist only during the lifetime of the session in which we ran 'dbms_logmnr.start_logmnr'. This is because all of LogMiner's storage is in PGA memory, which is invisible to other processes, and the analysis results disappear when the process ends.

Finally, use the procedure DBMS_LOGMNR.END_LOGMNR to terminate the log analysis session, at which point the PGA memory area is cleared and the analysis results no longer exist.
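The call takes no arguments:

```sql
BEGIN
  dbms_logmnr.end_logmnr;  -- releases the PGA memory used by the session
END;
/
```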

4. Other matters needing attention

We can use the LogMiner log analysis tool to analyze the redo log files generated by other database instances, not just the redo log files of the instance where LogMiner is installed. When using LogMiner to analyze other database instances, there are a few things to note:

1. LogMiner must use the dictionary file generated by the database instance being analyzed, not one generated by the database running LogMiner, and the character set of the database running LogMiner must be the same as the character set of the database being analyzed.

2. The platform of the database being analyzed must be the same as the platform of the database running LogMiner. That is to say, if the files to be analyzed were generated by Oracle 8i running on a UNIX platform, then LogMiner must also run on an Oracle instance on a UNIX platform; LogMiner running on another platform such as Microsoft NT cannot analyze them. The hardware of the two systems, however, does not have to be exactly the same.

3. The LogMiner log analysis tool can only analyze logs from Oracle 8 and later products; it can do nothing with logs from releases before Oracle 8.

5. Conclusion

LogMiner is a very powerful tool for database administrators (DBAs) and one that is frequently useful in daily work. With it, you can obtain a great deal of information about database activity. One of its most important uses is the ability to undo a specific change to the database without restoring the entire database. In addition, the tool can be used to monitor or audit user activity; for example, you can use LogMiner to see who changed data and what state the data was in before the change. We can also use this tool to analyze any redo log file generated by Oracle 8 and later. Another important feature is the ability to analyze log files from other databases. In short, LogMiner is a very effective tool for database administrators, and a deep understanding of and proficiency with it are very helpful in every database administrator's actual work.



