Wednesday, May 18, 2011

Options for Slow Performance Parsing Large XML

First, let me state that this is a situation-specific solution, not a general solution to all performance problems when parsing large XML and inserting the data into the DB. I was brought onto a project to code the PL/SQL portion of a business process that would be kicked off a few times a year via a screen. This process would call a remote web service to see if a set of data had been updated. If the data had been updated since the last run, the code would call the web service again to retrieve all the information in XML format. That information would be parsed and stored in a table in the DB, and a few automatic rules would be applied to update the data. Once the process finished, the user would perform manual steps to complete the larger business process.

The XML being returned from the web service was in a simple format:
<ResultSet>
 <Row>
   ...up to 16 nodes...
 </Row>
 <Row>
   ...up to 16 nodes...
 </Row>
</ResultSet>

There were some mandatory nodes and many optional ones. At the time of testing, 2,183 Row nodes were being returned. Coding and testing were done on 11.1.0.6.

Being lazy, I went with the simple XMLTABLE method for parsing the XML and storing it in the table. This was done via the basic
INSERT INTO ... 
SELECT ... 
  FROM XMLTABLE(... PASSING l_result_xml ...);
where l_result_xml was defined as an XMLType.
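
Fleshed out, the statement looked something along these lines (the import_queue columns and XML node names here are the simplified ones used later in this post; the real statement mapped up to 16 child nodes per Row):

INSERT INTO import_queue (statute /*, ...remaining columns... */)
SELECT x.statute /*, ...remaining columns... */
  FROM XMLTABLE('/ResultSet/Row'
                PASSING l_result_xml
                COLUMNS statute VARCHAR2(4000) PATH 'STATUTE'
                        /*, ...one COLUMNS entry per child node... */) x;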

During my testing, the total run time was 1459 seconds. While this was a seldom-run process, it was still too slow to give to the client. Given that this PL/SQL code included a web service call, a MERGE and an UPDATE statement, I wasn't sure where the slowdown was, so I used
dbms_utility.get_time

to get some basic timing information. The time for a run broke down as

HTTP TIME in sec: 30.88
INSERT INTO TIME in sec: 1031.99
MERGE TIME in sec: 0.03
UPDATE TIME in sec: 0.15
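
The timings are just deltas of dbms_utility.get_time, which returns hundredths of a second. A minimal sketch of the pattern (variable names are my own):

l_start := dbms_utility.get_time;
-- step being timed, e.g. the INSERT INTO ... SELECT ... FROM XMLTABLE(...)
l_end   := dbms_utility.get_time;
dbms_output.put_line('INSERT INTO TIME in sec: ' || To_Char((l_end - l_start) / 100));

where l_start and l_end are NUMBER variables.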


The time spent dealing with the web service seemed reasonable, and the MERGE/UPDATE times were great given the amount of SQL involved in them. The above INSERT INTO was killing performance, though. Having a pretty good guess as to the cause, I did an explain plan on the SQL statement and saw the dreaded
COLLECTION ITERATOR PICKLER FETCH

in the explain plan. All those memory operations on the XML were killing performance as expected.
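
If you want to see it for yourself, something along these lines will do it (I'm passing a tiny inline document here just so the statement parses on its own; in my case the real SQL statement was explained):

EXPLAIN PLAN FOR
SELECT x.statute
  FROM XMLTABLE('/ResultSet/Row'
                PASSING XMLTYPE('<ResultSet><Row><STATUTE>x</STATUTE></Row></ResultSet>')
                COLUMNS statute VARCHAR2(4000) PATH 'STATUTE') x;

SELECT * FROM TABLE(dbms_xplan.display);

Against a non-schema-based, non-binary XMLType like this, the plan should show the COLLECTION ITERATOR PICKLER FETCH step.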

For several reasons, registering a schema was not high on the list of options for better performance. My next approach was to manually parse the XML in PL/SQL into memory and then use a FORALL to store that into the database. For that approach I set up a structure like this (simplified for posting):
TYPE import_q_rec IS RECORD (statute          dbms_sql.Varchar2_Table,
                             end_dt           dbms_sql.Date_Table);
l_import_q_tab       import_q_rec;
l_result_xml         XMLTYPE;
l_temp_nd            XMLTYPE;
l_row_index          PLS_INTEGER := 1;

WHILE l_result_xml.Existsnode('/ResultSet/Row[' || To_Char(l_row_index) || ']') > 0
LOOP
  l_temp_nd := l_result_xml.Extract('/ResultSet/Row[' || To_Char(l_row_index) || ']');
  l_import_q_tab.statute(l_row_index) := f_extractxmltypestrval(l_temp_nd, '/Row/STATUTE/text()', NULL);
  -- ...end_dt and the other child nodes are extracted the same way...
  l_row_index := l_row_index + 1;
END LOOP;

FORALL i IN 1..l_row_index-1
  INSERT INTO import_queue
  VALUES
  (l_import_q_tab.statute(i), l_import_q_tab.end_dt(i));

This structure loops through each Row node in the XML and parses out the child nodes that exist. Since some nodes were optional, the logic hidden within f_extractxmltypestrval avoids throwing an error when the desired node does not exist.
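
The helper itself isn't shown above; a minimal sketch of the idea (my reconstruction, not the project code) would be:

FUNCTION f_extractxmltypestrval(p_xml     IN XMLTYPE,
                                p_xpath   IN VARCHAR2,
                                p_default IN VARCHAR2) RETURN VARCHAR2
IS
  l_node XMLTYPE;
BEGIN
  l_node := p_xml.Extract(p_xpath);
  -- Extract returns NULL when the node isn't present, so optional nodes
  -- simply fall back to the supplied default instead of raising an error
  IF l_node IS NULL THEN
    RETURN p_default;
  END IF;
  RETURN l_node.Getstringval();
END f_extractxmltypestrval;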

Once all this information was parsed and loaded into memory, the FORALL would store it into the table.

The first clean run through the new code was noticeably faster, with a total run time of 37 seconds. The time for a run broke down as

HTTP TIME in sec: 32.43
PARSE XML TIME in sec: 3.59
FORALL TIME in sec: 0.47
MERGE TIME in sec: 0.06
UPDATE TIME in sec: 0.27


So the original approach spent nearly 1032 seconds parsing the XML and storing it into the table, while the approach above took approximately 4 seconds. Given how infrequently this process runs, that performance was good enough to go with this method.

So what did this show? When Oracle has to parse large amounts of XML that is stored as a CLOB or a PL/SQL XMLType variable, performance will not be great (at best). This is confirmed by Mark Drake via this XMLDB thread. If you are looking for performance increases, then you have to go with other options. The above is one option that worked well for the situation at hand. Another option is to register a schema in the database, create a table based off that schema, store the XML in there and parse it that way (if still needed). A third option would be to store the XML as binary XML (available in 11g). How those compare to the pure PL/SQL approach is a good question, and hopefully I'll get around to looking into them soon.
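
For reference, the binary XML route would look roughly like this (untested by me for this process; xml_stage is a hypothetical staging table):

-- 11g: store the document as binary XML in a staging table
CREATE TABLE xml_stage OF XMLTYPE
  XMLTYPE STORE AS BINARY XML;

-- then parse from the table instead of a PL/SQL XMLType variable
INSERT INTO import_queue (statute /*, ...remaining columns... */)
SELECT x.statute /*, ...remaining columns... */
  FROM xml_stage t,
       XMLTABLE('/ResultSet/Row'
                PASSING t.object_value
                COLUMNS statute VARCHAR2(4000) PATH 'STATUTE') x;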

Edit: June 9, 2011
As stated above, my testing was done on 11.1.0.6. In this OTN thread, see the answer from Mark (mdrake) regarding some internal performance changes made in 11.1.0.7 that would alter my original results. The changes should make things faster, so I'd be interested to see how much faster.