System.Diagnostics.Tracing.EventListener increasing CPU usage

truthz03

New member · Joined: Dec 15, 2020 · Messages: 3 · Programming Experience: 10+
Hi,

I'm using an EventListener inside my netcoreapp3.1 application, which is running inside a Linux Docker container.
The strange thing is that the CPU usage of my app increases over time, even though the app is not doing anything else.
If I comment out the EventListener, I don't get the increasing CPU usage.

This is my code:
EventListener:
public class EnvironmentInfoService : EventListener, IEnvironmentInfoService
{
    private readonly ILogger<EnvironmentInfoService> m_logger;
    private readonly IOptions<EnvironmentInfoOptions> m_options;

    private readonly IList<EventSource> m_missedEvents = new List<EventSource>();
    private static readonly Dictionary<string, object> m_eventCounters = new Dictionary<string, object>();
    private static readonly Stopwatch m_watch = Stopwatch.StartNew();
    private static int m_eventsReceived = 0;

    public EnvironmentInfoService(ILogger<EnvironmentInfoService> logger, IOptions<EnvironmentInfoOptions> options)
    {
        m_logger = logger;
        m_options = options;

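        // Replay any sources that the base EventListener constructor reported before
        // m_options was assigned (they were buffered in OnEventSourceCreated below).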
        lock (m_missedEvents)
        {
            foreach (var source in m_missedEvents)
            {
                OnEventSourceCreated(source);
            }
        }
    }

    protected override void OnEventSourceCreated(EventSource source)
    {
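        // Note: the base EventListener constructor can call this for already existing
        // sources before our constructor body has run, i.e. while m_options is still null.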
        if (m_options == null)
        {
            lock (m_missedEvents)
            {
                m_missedEvents.Add(source);
            }
        }
        else
        {
            if (m_options.Value.EnableEventCounters)
            {
                m_logger.LogInformation("Enable Event Source: '" + source.Guid + "/" + source.Name + "' EventCounterIntervalSec: " + m_options.Value.EventCounterIntervalSec);
                EnableEvents(source, EventLevel.Verbose, EventKeywords.All, new Dictionary<string, string>()
                {
                    ["EventCounterIntervalSec"] = m_options.Value.EventCounterIntervalSec.ToString()
                });
            }
        }
    }

    protected override void OnEventWritten(EventWrittenEventArgs eventData)
    {
        lock (m_watch)
        {
            m_eventsReceived++;
            if (m_watch.ElapsedMilliseconds > 1000 * 60 * 10)
            {
                m_logger.LogInformation("OnEventWritten: " + m_eventsReceived + " in " + m_watch.Elapsed.ToString());
                m_watch.Restart();
                m_eventsReceived = 0;
            }
        }

        if (!eventData.EventName.Equals("EventCounters"))
        {
            return;
        }

        lock (m_eventCounters)
        {
            m_logger.LogInformation("Source: '" + eventData.EventSource.Guid + "/" + eventData.EventSource.Name + "/" + eventData.Level + "/" + eventData.Keywords + "' Event: '" + eventData.EventId + "/" + eventData.EventName + "' EventCounters received: " + eventData.Payload.Count);
            for (int i = 0; i < eventData.Payload.Count; ++i)
            {
                if (eventData.Payload[i] is IDictionary<string, object> eventPayload)
                {
                    var (counterName, counterValue) = GetRelevantMetric(eventPayload);
                    m_eventCounters[counterName] = counterValue;
                    m_logger.LogInformation(i + " Source: '" + eventData.EventSource.Guid + "/" + eventData.EventSource.Name + "' Event: '" + eventData.EventId + "/" + eventData.EventName + "' EventCounter '" + counterName + "' received with value: " + counterValue);
                }
            }
        }
    }

    private static (string counterName, string counterValue) GetRelevantMetric(IDictionary<string, object> eventPayload)
    {
        var counterName = "";
        var counterValue = "";

        if (eventPayload.TryGetValue("DisplayName", out object displayValue))
        {
            counterName = displayValue.ToString();
        }
        if (eventPayload.TryGetValue("Mean", out object value) ||
            eventPayload.TryGetValue("Increment", out value))
        {
            counterValue = value.ToString();
        }

        return (counterName, counterValue);
    }
}

I have already added some logging to analyze the problem but didn't find any reason why the CPU usage keeps increasing.
Within 2 days the CPU usage increases from 0.7% to 1.1%.
That is not much by itself, but I have 30 apps with this EventListener running on the same machine, so the total CPU usage goes from about 20% to 35% in 2 days and to about 70% after a week.

The EnableEvents method is called about 10 times after the app starts, once per source (no duplicates).
The OnEventWritten method is called about 1000-2000 times per minute.
EventCounterIntervalSec is set to 5 minutes, and I indeed receive EventCounters only every 5 minutes.
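
To break those 1000-2000 OnEventWritten calls per minute down by source, a throwaway listener like the sketch below could be used (just a sketch; TallyListener, its hard-coded "300" interval and the Dump method are illustrative and not part of my service):

using System;
using System.Collections.Generic;
using System.Diagnostics.Tracing;
using System.Linq;

// Sketch only: counts OnEventWritten calls per source/event name so the
// 1000-2000 events per minute can be attributed to concrete sources.
public sealed class TallyListener : EventListener
{
    private readonly Dictionary<string, int> m_tally = new Dictionary<string, int>();

    protected override void OnEventSourceCreated(EventSource source)
    {
        // Same subscription as the real service: everything, verbose, 5-minute counter interval.
        EnableEvents(source, EventLevel.Verbose, EventKeywords.All,
            new Dictionary<string, string> { ["EventCounterIntervalSec"] = "300" });
    }

    protected override void OnEventWritten(EventWrittenEventArgs eventData)
    {
        var key = eventData.EventSource.Name + "/" + eventData.EventName;
        lock (m_tally)
        {
            m_tally.TryGetValue(key, out int count);
            m_tally[key] = count + 1;
        }
    }

    // Call periodically (e.g. from a timer) to see which events dominate.
    public void Dump()
    {
        lock (m_tally)
        {
            foreach (var pair in m_tally.OrderByDescending(p => p.Value))
            {
                Console.WriteLine(pair.Value + "  " + pair.Key);
            }
        }
    }
}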

This Event "Source: '9ded64a4-414c-5251-dcf7-1e4e20c15e70/Microsoft.AspNetCore.Hosting/LogAlways/None' Event: '-1/EventCounters'" I will get 4 times every 5min which includes the following counter names:
Request Rate
Total Requests
Current Requests
Failed Requests

This Event "Source: '49592c0f-5a05-516d-aa4b-a64e02026c89/System.Runtime/LogAlways/None' Event: '-1/EventCounters'" I will get 19 times every 5min which includes the following counter names:
Monitor Lock Contention Count
ThreadPool Queue Length
ThreadPool Completed Work Item Count
% Time in GC since last GC
Gen 0 Size
Gen 1 Size
Gen 2 Size
LOH Size
Allocation Rate
Number of Assemblies Loaded
Number of Active Timers
Working Set
GC Heap Size
Gen 0 GC Count
Gen 1 GC Count
Gen 2 GC Count
Exception Count
ThreadPool Thread Count
CPU Usage


Any ideas what I can do to avoid the increasing CPU usage?

Thanks, Thomas
 
How exactly are you commenting out the EventListener?
 
Also: Have you considered what the downstream effects of writing to your m_logger are? E.g., does it append to a file? Is the OS doing extra work as that file gets longer and longer? Is there AV software that scans modified files, so that the bigger the file gets, the more data there is to scan? Is the logger keeping an in-memory list instead of a circular buffer? Is that in-memory list getting bigger and bigger?
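
For example, a quick way to take the logger completely out of the equation for a test run would be to construct the listener with a no-op logger (a sketch; it assumes the listener can be created manually, that the options properties are settable, and that EventCounterIntervalSec is an integer number of seconds):

using Microsoft.Extensions.Logging.Abstractions;
using Microsoft.Extensions.Options;

// Sketch: same listener, but all LogInformation calls become no-ops,
// so any remaining CPU growth cannot come from the logging pipeline.
var listener = new EnvironmentInfoService(
    NullLogger<EnvironmentInfoService>.Instance,
    Options.Create(new EnvironmentInfoOptions
    {
        EnableEventCounters = true,
        EventCounterIntervalSec = 300 // assumed to be seconds (5-minute interval)
    }));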
 

The logger is an Elasticsearch sink, so the logs are sent to another Docker container and written to a database there.
But the problem is the same without all of the logging. I added all the m_logger... lines only to find the problem; it was already there before I added them.
 