Question: Need help making API pulls faster


Mar 5, 2022
Hello everyone,

I am unfortunately working with an API from a vendor who won't change it to make the pull I need faster and more efficient. In short, the API returns sales by store for a date range using store ID, begin date, and end date parameters, so I am stuck calling the API hundreds of times with different store ID parameters to get the data I need. I then take the response as a stream and deserialize it before loading it into a SQL DB. Ideally I'd like to not send a store ID at all and just do one pull for the needed date range, but as I mentioned, the vendor won't, or doesn't have the resources to, make that dev change to the API.

As you can imagine, hundreds of API calls and DB loads are very slow, but I have to work with the API the way it is. My code gets all the needed store IDs, passes my date range in, gets the response, deserializes it to an object, then inserts to the DB store by store. To try to speed up the process, I'd like to perform all the API calls (hundreds), append them to a stream once, deserialize once, and perform one DB insert. Right now the last three steps happen one at a time per store. The part I need help with is appending all the calls to one stream (if that can even be done).

I'd then deserialize it once and perform one insert.

// Note: the format string needs placeholders ({0}, {1}, {2}) for the three
// arguments below; "MyAPIPath" stands in for the real endpoint path.
string urlParams = String.Format("MyAPIPath", Variables.CompanySequence.ToString(), fromDate.ToString("MM/dd/yyyy"), toDate.ToString("MM/dd/yyyy"));

// This is inside a loop over stores
try
{
    await RunAsync(urlParams);
}
catch (Exception ex)
{
    // Error handling
}

static async Task<List<Sales_Export>> RunAsync(string urlParams)
{
    using (var client = new HttpClient(new HttpClientHandler { AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate }))
    {
        client.BaseAddress = new Uri("MyURL");
        client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

        var streamresponse = await client.GetStreamAsync(urlParams);
        using (var sr = new StreamReader(streamresponse))
        using (JsonReader reader = new JsonTextReader(sr))
        {
            var serializer = new JsonSerializer();
            return serializer.Deserialize<List<Sales_Export>>(reader);
        }
    }
}



I feel like there should be a way to append the streamresponse from the hundreds of API calls I make to the StreamReader. I think that would allow the rest of the code to deserialize once. My DB insert (not shown) would also run once. Any help would be greatly appreciated.
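One way to approximate a single pull on the client side, rather than appending streams, is to run the per-store calls concurrently, merge the deserialized lists, and do one bulk insert at the end. A minimal sketch, assuming RunAsync returns the deserialized list (as in the commented-out signature above); BuildUrlParams and BulkInsertAsync are hypothetical stand-ins for your URL-building code and your DB insert (e.g. wrapping SqlBulkCopy):

```csharp
// Fan out the per-store API calls instead of awaiting them one at a time.
var tasks = storeIds
    .Select(id => RunAsync(BuildUrlParams(id, fromDate, toDate)))
    .ToList();

// Task.WhenAll lets the HTTP calls overlap in flight.
List<Sales_Export>[] perStoreResults = await Task.WhenAll(tasks);

// Flatten to one list, then perform a single DB insert.
List<Sales_Export> allSales = perStoreResults.SelectMany(x => x).ToList();
await BulkInsertAsync(allSales); // hypothetical: e.g. wraps SqlBulkCopy
```

If the vendor can't handle hundreds of simultaneous requests, you can throttle the fan-out with a SemaphoreSlim so only a fixed number of calls are in flight at once.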

You cannot just blindly append the stream data sequentially, because JSON requires that all structures be closed: two concatenated arrays are not one valid JSON document.

I suggest running a profiler first to determine where the actual bottleneck is, before assuming that it's the multiple deserializations that are expensive or the multiple database writes that are slow.
As an aside, you are using HttpClient incorrectly. Read the documentation: the recommendation is to create just one instance for the lifetime of your application.
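A minimal sketch of that pattern, reusing the handler and base address from the post's own code (the "MyURL" placeholder is kept as-is):

```csharp
// One HttpClient for the lifetime of the application, instead of
// constructing a new one inside every call. A client per request
// can exhaust sockets under load.
static readonly HttpClient client = new HttpClient(
    new HttpClientHandler
    {
        AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
    })
{
    BaseAddress = new Uri("MyURL")
};

static async Task<List<Sales_Export>> RunAsync(string urlParams)
{
    using (var stream = await client.GetStreamAsync(urlParams))
    using (var sr = new StreamReader(stream))
    using (var reader = new JsonTextReader(sr))
    {
        return new JsonSerializer().Deserialize<List<Sales_Export>>(reader);
    }
}
```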
Anyway, since you are already using async, you can overlap operations by asynchronously downloading from the API and asynchronously writing to the database.
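One way to get that overlap is a producer/consumer pipeline with System.Threading.Channels: the producer downloads and deserializes the next store's data while the consumer writes the previous batch to the database. A sketch, where BuildUrlParams and WriteBatchAsync are hypothetical stand-ins for the URL-building and DB-insert code:

```csharp
// Bounded channel so the producer can't run arbitrarily far ahead
// of the database writer.
var channel = Channel.CreateBounded<List<Sales_Export>>(10);

var producer = Task.Run(async () =>
{
    foreach (var id in storeIds)
    {
        var batch = await RunAsync(BuildUrlParams(id, fromDate, toDate));
        await channel.Writer.WriteAsync(batch);
    }
    channel.Writer.Complete(); // signal the consumer that no more batches are coming
});

var consumer = Task.Run(async () =>
{
    await foreach (var batch in channel.Reader.ReadAllAsync())
    {
        await WriteBatchAsync(batch); // DB insert overlaps the next download
    }
});

await Task.WhenAll(producer, consumer);
```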