Resolved: CsvHelper throws "No header record found" error

Ankit

Member
Joined
Dec 16, 2020
Messages
17
Programming Experience
3-5
Hi, below is the code I am using to read a CSV and copy its contents into Azure Table storage, but I receive the error "No header record found".
Below is a sample of the CSV I am trying to read:

CSV:

PartitionKey;Time;RowKey;State;RPM;Distance;RespirationConfidence;HeartBPM
te123;2020-11-06T13:33:37.593Z;10;1;8;20946;26;815
te123;2020-11-06T13:33:37.593Z;4;2;79944;8;36635;6
te123;2020-11-06T13:33:37.593Z;3;3;80042;9;8774;5
te123;2020-11-06T13:33:37.593Z;1;4;0;06642;6925;37
te123;2020-11-06T13:33:37.593Z;6;5;04740;74753;94628;21
te123;2020-11-06T13:33:37.593Z;7;6;6;2;14;629
te123;2020-11-06T13:33:37.593Z;9;7;126;86296;9157;05
te123;2020-11-06T13:33:37.593Z;5;8;5;3;7775;08
te123;2020-11-06T13:33:37.593Z;2;9;44363;65;70;229
te123;2020-11-06T13:33:37.593Z;8;10;02;24666;2;2
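For clarity, here is a minimal, dependency-free sketch (plain string splitting rather than CsvHelper; the variable names are illustrative) of what the header row of this sample should produce once it is read correctly:

```csharp
using System;
using System.Linq;

// First two lines of the sample CSV above, standing in for the blob content.
var sample = "PartitionKey;Time;RowKey;State;RPM;Distance;RespirationConfidence;HeartBPM\n" +
             "te123;2020-11-06T13:33:37.593Z;10;1;8;20946;26;815\n";

var lines = sample.Split('\n', StringSplitOptions.RemoveEmptyEntries);

// The first line is the header; splitting on ';' gives the column names
// that CsvHelper's ReadHeader() is expected to expose.
string[] header = lines[0].Split(';');
string[] firstRecord = lines[1].Split(';');

// Map column name -> value for the first record.
var record = header.Zip(firstRecord, (h, v) => (h, v))
                   .ToDictionary(p => p.h, p => p.v);

Console.WriteLine(header.Length);     // 8
Console.WriteLine(record["RowKey"]);  // 10
```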



C#:
    public Type DataType
    {
        get
        {
            switch (Type.ToUpper())
            {
                case "STRING":
                    return typeof(string);

                case "INT":
                    return typeof(int);

                case "BOOL":
                case "BOOLEAN":
                    return typeof(bool);

                case "FLOAT":
                case "SINGLE":
                case "DOUBLE":
                    return typeof(double);

                case "DATETIME":
                    return typeof(DateTime);

                default:
                    throw new NotSupportedException($"CSVColumn data type '{Type}' not supported");
            }
        }
    }
// This is the method where the stream is passed in; the error is thrown at csv.ReadHeader()
    private IEnumerable<Dictionary<string, EntityProperty>> ReadCSV(Stream source, IEnumerable<TableField> cols)
    {
        source.Position = 0;
        using (TextReader reader = new StreamReader(source, Encoding.UTF8))
        {
            var cache = new TypeConverterCache();
            cache.AddConverter<float>(new CSVSingleConverter());
            cache.AddConverter<double>(new CSVDoubleConverter());

            // Note: do not call reader.ReadLine() here; that would consume the
            // header row before CsvHelper gets a chance to read it.
            using (var csv = new CsvReader(reader,
                new CsvHelper.Configuration.CsvConfiguration(System.Globalization.CultureInfo.InvariantCulture)
                {
                    Delimiter = ";",
                    HasHeaderRecord = true,
                    TypeConverterCache = cache
                }))
            {
                csv.Read();
                csv.ReadHeader();

                var map = (
                    from col in cols
                    from src in col.Sources()
                    let index = csv.GetFieldIndex(src, isTryGet: true)
                    where index != -1
                    select new { col.Name, Index = index, Type = col.DataType }).ToList();

                while (csv.Read())
                {
                    yield return map.ToDictionary(
                        col => col.Name,
                        col => EntityProperty.CreateEntityPropertyFromObject(csv.GetField(col.Type, col.Index)));
                }
            }
        }
    }
// This is the method that returns the stream consumed by ReadCSV() above
    public async Task<Stream> ReadStream(string containerName, string digestFileName, string fileName, string connectionString)
    {
        var contents = await DownloadBlob(containerName, digestFileName, connectionString);
        return contents;
    }

// Method where the blob is opened as a stream
    public async Task<Stream> DownloadBlob(string containerName, string fileName, string connectionString)
    {
        Microsoft.Azure.Storage.CloudStorageAccount storageAccount = Microsoft.Azure.Storage.CloudStorageAccount.Parse(connectionString);
        CloudBlobClient serviceClient = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = serviceClient.GetContainerReference(containerName);
        CloudBlockBlob blob = container.GetBlockBlobReference(fileName);

        if (!await blob.ExistsAsync())
        {
            throw new Exception($"Blob '{fileName}' not found; unable to load data into the table store for this document");
        }

        return await blob.OpenReadAsync();
    }
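As a side note on the DataType property above: a resolved Type is typically paired with Convert.ChangeType to coerce the raw CSV string into the CLR type before it is wrapped in an EntityProperty. Here is a self-contained sketch (ResolveType is a hypothetical local copy of the switch, so the snippet runs without the rest of the class):

```csharp
using System;
using System.Globalization;

// Hypothetical local copy of the DataType switch above, so the conversion
// step can be demonstrated in isolation.
static Type ResolveType(string typeName) => typeName.ToUpperInvariant() switch
{
    "STRING" => typeof(string),
    "INT" => typeof(int),
    "BOOL" or "BOOLEAN" => typeof(bool),
    "FLOAT" or "SINGLE" or "DOUBLE" => typeof(double),
    "DATETIME" => typeof(DateTime),
    _ => throw new NotSupportedException($"CSVColumn data type '{typeName}' not supported"),
};

// Coerce a raw CSV field into the resolved CLR type.
object rpm = Convert.ChangeType("20946", ResolveType("INT"), CultureInfo.InvariantCulture);
Console.WriteLine($"{rpm} ({rpm.GetType().Name})"); // 20946 (Int32)
```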
 

Ankit

Glad you resolved it. I hope you're taking the time to update all your other crossposts so that other people aren't wasting their time still trying to answer your question.

Hopefully someday you'll also explain why having the StreamReader worked when you were using a MemoryStream or a FileStream, but not the Azure blob stream.
Yes, the other posts are updated. MemoryStream and FileStream always worked, and initially I was working with a MemoryStream, but the blob stream was adopted to avoid memory exceptions for large blobs.
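For anyone weighing the same trade-off: if the blob is small enough, one common workaround is to buffer the remote stream into a local, seekable MemoryStream before handing it to the reader. A sketch, assuming the blob stream behaves like any other non-seekable Stream (a MemoryStream stands in for the real CloudBlockBlob.OpenReadAsync() result here):

```csharp
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;

// Stand-in for the blob read stream returned by OpenReadAsync().
using var source = new MemoryStream(Encoding.UTF8.GetBytes(
    "PartitionKey;Time;RowKey\nte123;2020-11-06T13:33:37.593Z;10\n"));

// Copy the remote stream into a local, seekable buffer. This trades memory
// for predictable reads: fine for small blobs, but it is exactly what the
// poster wanted to avoid for very large ones.
using var buffered = new MemoryStream();
await source.CopyToAsync(buffered);
buffered.Position = 0;

using var reader = new StreamReader(buffered, Encoding.UTF8);
string header = reader.ReadLine();
Console.WriteLine(header); // PartitionKey;Time;RowKey
```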
 

Skydiver

Staff member
Joined
Apr 6, 2019
Messages
5,667
Location
Chesapeake, VA
Programming Experience
10+
Yes, and were you also using the StreamReader with the file stream and memory streams? If so, why did it work then?
 
