Memory Leak with Oracle Insert Statement...

cboshdave
Member · Joined: Dec 11, 2014 · Messages: 24 · Programming Experience: 1-3
As my program steps through a dictionary and inserts records, my memory usage creeps up until I get a failure in my code. I am opening my connection to Oracle at the beginning and then just reuse the same connection during the entire process. So, calling the following code with a large insert statement roughly 100,000 times causes an Out of Memory exception. I am not sure how to reset things??

C#:
        public int ExecuteNonQuery(string sql)
        {
            try
            {
                //conn.Open();
                int affected;
                //Debug.WriteLine("Connection: " + this.conn.InstanceName);
                OracleCommand cmd = conn.CreateCommand();
                cmd.CommandText = sql;
                cmd.Connection = conn;
                affected = cmd.ExecuteNonQuery();
                //mytransaction.Commit();
                return affected;
            }
            catch (Exception ex)
            {
                Console.WriteLine("OracleDataAccessManager.ExecuteNonQuery(): " + ex);
                throw ex;  //Acts the same as a return.
            }
            finally
            {
                //conn.Close();  //Not using this section
            }
        }
 
Have you considered populating a DataTable with all the data and then using a data adapter to save the lot as a batch? Using a string that includes data like that is generally dodgy anyway.
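A minimal sketch of that approach, assuming ODP.NET (`Oracle.ManagedDataAccess`) and a hypothetical table `MY_TABLE` with placeholder columns `COL1`/`COL2` — substitute your real schema and source dictionary:

```csharp
using System.Data;
using System.Collections.Generic;
using Oracle.ManagedDataAccess.Client;

// Build one DataTable in memory, then let the adapter batch the inserts.
var table = new DataTable("MY_TABLE");
table.Columns.Add("COL1", typeof(string));
table.Columns.Add("COL2", typeof(string));

foreach (KeyValuePair<string, string> kvp in myDictionary) // hypothetical source
{
    DataRow row = table.NewRow();
    row["COL1"] = kvp.Key;
    row["COL2"] = kvp.Value;
    table.Rows.Add(row);
}

using (var conn = new OracleConnection(connectionString))
using (var adapter = new OracleDataAdapter())
using (var insert = new OracleCommand(
    "INSERT INTO MY_TABLE (COL1, COL2) VALUES (:c1, :c2)", conn))
{
    // Bind each parameter to a source column of the DataTable.
    insert.Parameters.Add(":c1", OracleDbType.Varchar2, 0, "COL1");
    insert.Parameters.Add(":c2", OracleDbType.Varchar2, 0, "COL2");
    insert.UpdatedRowSource = UpdateRowSource.None;

    adapter.InsertCommand = insert;
    adapter.UpdateBatchSize = 500;  // send rows in batches of 500
    conn.Open();
    adapter.Update(table);          // inserts every added row
}
```

Because the data travels as bind parameters instead of being spliced into SQL text, you also sidestep the quoting/injection problems that come with building the statement as a string.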
 
I did further testing. It does not seem to be this block of code. It actually seems to be the array that is built to create the sql statement string. I commented out the insert statement call and the memory still creeps up. I am building an array with all the values that go into the sql statement, with the following code:

string[] sItems = new string[92];
//Define all sItems now for insert based on Dictionary.
int i = 0;
sItems[i] = dictFind(sID, "GB"); i++; //field 1
sItems[i] = "000"; i++; //field 2
sItems[i] = dictFind(sID, "ZD"); i++; //field 3
sItems[i] = dictFind(sID, "D2"); i++; //field 4
...

Then, I use the values to build the sql statement. I read some about arrays, and it looks like the Garbage Collector should be cleaning this up?? I am quite certain this is where it is "leaking". Is there something I can do to clean this up? Each iteration loads the array, builds the sql statement and inserts, then reads the next line of the text file, loads the array again and inserts...
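One way to shrink the per-row string churn is to build the INSERT text once, with bind placeholders, rather than concatenating 92 literal values for every row. A sketch — the table and column names (`F1`, `F2`, ...) are placeholders, not the poster's actual schema:

```csharp
using System;
using System.Linq;

static class SqlBuilder
{
    // Build "INSERT INTO t (F1, ..., Fn) VALUES (:p1, ..., :pn)" one time;
    // per row you then only set parameter values, not rebuild the string.
    public static string BuildInsertSql(string tableName, int fieldCount)
    {
        var cols  = Enumerable.Range(1, fieldCount).Select(n => "F" + n);
        var binds = Enumerable.Range(1, fieldCount).Select(n => ":p" + n);
        return "INSERT INTO " + tableName +
               " (" + string.Join(", ", cols) + ")" +
               " VALUES (" + string.Join(", ", binds) + ")";
    }
}
```

For example, `SqlBuilder.BuildInsertSql("MY_TABLE", 3)` yields `INSERT INTO MY_TABLE (F1, F2, F3) VALUES (:p1, :p2, :p3)`. The command object can then be prepared once and executed repeatedly with fresh parameter values.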
 
I'd use a using statement for the connection and then wrap your other statements inside. I'm pretty sure the using statement will close/dispose of the connection automatically.
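That pattern would look roughly like the sketch below. Note that `OracleCommand` is also `IDisposable`, and the posted method never disposes it — un-disposed commands are a plausible contributor to the creeping memory here, so wrapping the command in a using block is worth trying even if the connection stays open for the whole run:

```csharp
using Oracle.ManagedDataAccess.Client;

public int ExecuteNonQuery(string sql)
{
    // "using" guarantees Dispose runs even when ExecuteNonQuery throws,
    // releasing the unmanaged resources the command holds.
    using (OracleCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = sql;
        return cmd.ExecuteNonQuery();
    }
}
```

This keeps the long-lived shared connection intact while still cleaning up the short-lived command on every call.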
 