Iterations - Speed problems

Hi,

I am struggling with speed issues regarding iterations. My app reads lines from files into memory, then the strings are processed and the condensed content is stored in a new file. It is a bit like finding pairs and putting everything they have in common together: information from a few lines is merged into one new line. Each row has at least one string that I store in a user-defined class, and the object is added to a list. Some of the files have more than 8 million records.
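The example method below assumes two small classes along these lines (the names A, B, V, Bs and Values come from the code; the string types and property shapes are just an assumption):
C#:
// Simplified sketch of the classes used in the example below (types assumed).
// One input row: an A value, a B value and an associated value V.
public class AAndBs
{
    public string A { get; set; }
    public string B { get; set; }
    public string V { get; set; }
}

// One output row: a unique A together with all of its Bs and Values.
public class NewAWithBs
{
    public string A { get; set; }
    public List<string> Bs { get; set; } = new List<string>();
    public List<string> Values { get; set; } = new List<string>();
}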

I am not sure how best to explain it, so here is an example method:
C#:
private List<NewAWithBs> CreateAWithBsStructure(List<AAndBs> SG)
{
    List<NewAWithBs> Output = new List<NewAWithBs>();

    // Just to get the unique A values. For each unique item the corresponding B values are added.
    var As = SG.GroupBy(x => x.A.Trim()).Select(x => x.FirstOrDefault());

    for (int i = 0; i < As.Count(); i++)
    {
        NewAWithBs mm = new NewAWithBs();
        mm.A = As.ElementAt(i).A;

        // Scan the whole list again for every unique A to collect its Bs and values.
        for (int j = 0; j < SG.Count; j++)
        {
            if (SG[j].A.ToUpper().Trim() == mm.A.ToUpper().Trim())
            {
                mm.Bs.Add(SG[j].B);
                mm.Values.Add(SG[j].V);
            }
        }

        Output.Add(mm);
    }

    return Output;
}
That is just an example. The filtered list of unique A values might have, say, 2,600 entries, while SG has 8 or 9 million rows or even far more, so this takes a while. I also do the same thing the other way round: filtered on B, the list can contain 30,000 unique entries. Doing both has taken more than 23 hours, and the process currently uses 2.6 GB of RAM. What can I do to increase the speed? There are still a lot of larger files to process.
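For reference, here is a minimal sketch of a single-pass variant that keeps a Dictionary keyed on the normalized A value, so the big list is walked once instead of once per unique A. The class and property names are taken from the example above, the case-insensitive comparer mirrors the ToUpper() comparison in the inner loop, and it is untested against the real data:
C#:
private List<NewAWithBs> CreateAWithBsStructureSinglePass(List<AAndBs> SG)
{
    // One bucket per normalized A value; the 8+ million rows are walked only once.
    var buckets = new Dictionary<string, NewAWithBs>(StringComparer.OrdinalIgnoreCase);

    foreach (AAndBs row in SG)
    {
        string key = row.A.Trim();

        // Reuse the existing bucket for this A, or create a new one on first sight.
        if (!buckets.TryGetValue(key, out NewAWithBs mm))
        {
            mm = new NewAWithBs { A = row.A };
            buckets.Add(key, mm);
        }

        mm.Bs.Add(row.B);
        mm.Values.Add(row.V);
    }

    return buckets.Values.ToList();
}
With roughly 2,600 unique A values against 8 million rows, the nested loops above do on the order of 20 billion string comparisons, whereas the dictionary version touches each row only once.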
 