No, as I mentioned, there are only two classifications: one is Source A, and everything else is considered Source B. Can you please elaborate on the process of elimination?
Below I have an array of JSON coming from different data sources. My job is to get the IDs for each JSON item and pass them to different containers based on which source they come from. The way to differentiate here is that items coming from Source A will have the properties Id, Project and ProjectId...
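A minimal sketch of the property-based elimination described above, assuming Newtonsoft.Json; the sample payload and container names are illustrative, not from the original post:

```csharp
using System;
using Newtonsoft.Json.Linq;

class SourceClassifier
{
    static void Main()
    {
        // Illustrative payload: only the first item has all three Source A properties.
        var items = JArray.Parse(@"[
            { ""Id"": 1, ""Project"": ""P1"", ""ProjectId"": 10 },
            { ""Id"": 2, ""SomethingElse"": true }
        ]");

        foreach (var item in items.Children<JObject>())
        {
            // Source A items carry Id, Project and ProjectId; anything else falls through to Source B.
            bool isSourceA = item.ContainsKey("Id")
                          && item.ContainsKey("Project")
                          && item.ContainsKey("ProjectId");

            var id = item["Id"]?.ToString() ?? "(no id)";
            Console.WriteLine($"{id} -> {(isSourceA ? "Source A container" : "Source B container")}");
        }
    }
}
```

Since there are only two classifiers, a single presence check is enough; no per-source lookup table is needed.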
Yes, the other posts are updated. MemoryStream and FileStream have always worked, and initially I was working with MemoryStream, but the blob stream was adopted to avoid memory exceptions for large blobs.
Finally, the issue is resolved now. It was a StreamReader that was setting the position of the stream to the end, which is why it was returning null. For some reason I am still unable to set Position back to 0, but I have now reworked the function and removed the StreamReader before calling ReadCsv().
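A small sketch of the rewind-or-buffer pattern behind this fix, assuming the stream comes from a blob download; the helper name is hypothetical:

```csharp
using System.IO;

static class StreamHelper
{
    // Ensure CsvHelper sees the stream from the beginning.
    static Stream PrepareForCsv(Stream source)
    {
        if (source.CanSeek)
        {
            // A leftover StreamReader may have consumed the stream to the end.
            source.Position = 0;
            return source;
        }

        // Non-seekable blob streams cannot be rewound; buffer them first.
        var buffer = new MemoryStream();
        source.CopyTo(buffer);
        buffer.Position = 0;
        return buffer;
    }
}
```

If setting Position = 0 fails, check CanSeek first; some blob download streams are forward-only, in which case buffering is the only option.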
I have updated the code with the latest ReadCsv() method. If someone with Azure can test this from their end: read the blob as I did with the DownloadBlob method and pass the stream to be read by CsvHelper.
streams.Position = 0 was set as the first thing in the ReadCsv() method, and I get "Object reference not set to an instance of an object", even though I can see a valid position and length for the stream in Quick Watch.
streams.Position = 0 gives me "Object reference not set to an instance of an object". I put this at the beginning of the ReadCsv() method. Discussions in other forums suggest that this kind of stream is not supported by CsvHelper; only MemoryStream and FileStream are supported. I can't ignore that, but would like to know if someone...
Well, it doesn't seem to be that this time. If you look back, last time I was reading and passing a MemoryStream, and that worked fine with my ReadCsv() method. With large CSV files, reading everything into memory is not an ideal solution, which is why we are moving to reading the stream via OpenRead()...
Hi, below is the code where I am trying to read a CSV and copy the file into table storage, but I receive the error "No header record found".
Below is the sample CSV I am trying to read:
PartitionKey;Time;RowKey;State;RPM;Distance;RespirationConfidence;HeartBPM...
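The sample header is semicolon-separated, so a likely cause of "No header record found" is either the delimiter or a stream already read to the end. A sketch of reading it with CsvHelper, assuming a recent CsvHelper version; the field mapping is illustrative:

```csharp
using System.Globalization;
using System.IO;
using CsvHelper;
using CsvHelper.Configuration;

static class CsvReaderSketch
{
    static void ReadCsv(Stream stream)
    {
        // An earlier reader may have consumed the stream; rewind if possible.
        if (stream.CanSeek)
            stream.Position = 0;

        var config = new CsvConfiguration(CultureInfo.InvariantCulture)
        {
            Delimiter = ";",   // the sample header uses semicolons, not commas
        };

        using var reader = new StreamReader(stream);
        using var csv = new CsvReader(reader, config);

        csv.Read();
        csv.ReadHeader();      // throws "No header record found" on an empty or exhausted stream
        while (csv.Read())
        {
            var partitionKey = csv.GetField("PartitionKey");
            var rowKey = csv.GetField("RowKey");
            // ... map the remaining fields onto a table entity before inserting
        }
    }
}
```

CsvHelper reads any Stream through the StreamReader, so the MemoryStream/FileStream restriction mentioned elsewhere in the thread is really about the stream being readable and positioned at the start.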
I wrote my code something like this, and I was able to read a .gz file while writing a test case. In production, however, I am getting the error "Stream does not support reading. (Parameter 'stream')" on the line where I use GZipStream. What am I doing wrong now?
public async Task<string> ReadStream(string...
I see what you mean. The question now is: I see the unzipper has a Read method, which is close to what I am passing in my "data" variable, but I don't see that working if I pass it like this:
using (var unzipper = new GZipStream(contents, CompressionMode.Decompress))
{...
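That error means CanRead is false on the stream handed to GZipStream (for example, it was opened for writing or already disposed). A sketch of decompressing a readable .gz stream fully before parsing; the method name is illustrative:

```csharp
using System;
using System.IO;
using System.IO.Compression;

static class GzipHelper
{
    static MemoryStream Decompress(Stream contents)
    {
        // GZipStream in Decompress mode requires a readable source stream.
        if (!contents.CanRead)
            throw new ArgumentException("Source stream must be readable", nameof(contents));

        var output = new MemoryStream();
        using (var unzipper = new GZipStream(contents, CompressionMode.Decompress))
        {
            // Reads compressed bytes from 'contents', writes decompressed bytes to 'output'.
            unzipper.CopyTo(output);
        }
        output.Position = 0;   // rewind so the caller (e.g. CsvHelper) starts at the top
        return output;
    }
}
```

Rather than calling unzipper.Read manually, CopyTo drains the whole stream; the returned MemoryStream is seekable and safe to rewind.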
Can you show me how to integrate that with my existing code? The above class takes a path, while I have the file name, and the return will be a stream of data or the uncompressed file?
Below is the method where I am reading a CSV file from an Azure blob container and then calling a function to copy the contents into table storage.
Now my requirement has changed a bit: the .csv file will be compressed to a .gz file in the blob container. I would like to know how I can modify...
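One way to adapt the blob-reading step, sketched under the assumption that the Azure.Storage.Blobs v12 SDK is in use; the method and blob names are illustrative:

```csharp
using System.IO;
using System.IO.Compression;
using System.Threading.Tasks;
using Azure.Storage.Blobs;

static class GzBlobReader
{
    // Returns a plain-CSV stream that can be handed to CsvHelper.
    static async Task<Stream> OpenCsvFromGzBlobAsync(BlobContainerClient container, string blobName)
    {
        BlobClient blob = container.GetBlobClient(blobName);   // e.g. "export.csv.gz"
        Stream compressed = await blob.OpenReadAsync();        // streams the blob without buffering it all

        // Wrap the blob stream so downstream code sees decompressed CSV text.
        return new GZipStream(compressed, CompressionMode.Decompress);
    }
}
```

Note that the returned GZipStream is forward-only and not seekable, so do not set Position = 0 on it; if a rewind is needed, copy it into a MemoryStream first.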