
Thursday, February 8, 2018

Sending log file / CSV data to Elasticsearch without the header row - PART II

Let's take my previous example. I have the sample CSV data below:


stock,open,close,date
icici,240,350,05-02-2014
sbi,140,250,05-02-2014
infy,950,1150,05-02-2014
tcs,2400,3500,05-02-2014

When using the Logstash config file below, the header row also goes to standard output. (We could replace stdout with Elasticsearch; the stdout plugin is used here for demonstration purposes.)

logstash.conf

input {
  file {
    path => "C:/java-developper-softwares/prashant/ELK/data/stock.csv"
    start_position => "beginning"
    sincedb_path => "NUL"    # on Windows; use "/dev/null" on Linux
  }
}
filter {
  csv {
    columns   => ["stock", "open", "close", "date"]
    separator => ","
  }

  # Part I approach: drop the header by matching the raw message
  # if [message] =~ /^stock/ {
  #   drop {}
  # }
}
output {
  stdout {
    codec => rubydebug
  }
}
When running the command "logstash -f logstash.conf", we get the output below.
[Screenshot: rubydebug output on stdout - all five CSV rows, including the header row, appear as separate events]
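For reference, the header row becomes an event of its own. With the rubydebug codec it looks roughly like this (the host, path, and @timestamp values are illustrative, and field order may differ):

{
       "message" => "stock,open,close,date",
      "@version" => "1",
    "@timestamp" => 2018-02-08T06:30:00.000Z,
          "path" => "C:/java-developper-softwares/prashant/ELK/data/stock.csv",
          "host" => "WINDOWS-PC",
         "stock" => "stock",
          "open" => "open",
         "close" => "close",
          "date" => "date"
}

Notice that every CSV column ends up holding its own column name as the value, which is exactly what we test for in the next step.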
Now our requirement is that we don't want to send the header row, i.e. "stock,open,close,date".
Since the csv filter parses each column into its own named field, we can compare the value of the stock field: when it equals the literal string "stock", the event is the header row and can be dropped.
Here I compare the stock column, but logically we could compare any column defined in our CSV file.

So let's modify our logstash.conf as below:
input {
  file {
    path => "C:/java-developper-softwares/prashant/ELK/data/stock.csv"
    start_position => "beginning"
    sincedb_path => "NUL"    # on Windows; use "/dev/null" on Linux
  }
}
filter {
  csv {
    columns   => ["stock", "open", "close", "date"]
    separator => ","
  }

  # The header row is parsed like any other line, so its "stock"
  # field holds the literal column name - drop that event
  if [stock] == "stock" {
    drop {}
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
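With this change, re-running "logstash -f logstash.conf" prints only the four stock events; the header event is dropped before it ever reaches the output.

As a side note, newer versions of the logstash-filter-csv plugin also provide a skip_header option, which skips any row whose values exactly match the configured column names. A minimal sketch, assuming your plugin version supports it:

filter {
  csv {
    columns     => ["stock", "open", "close", "date"]
    separator   => ","
    skip_header => true    # skips rows that exactly match the column names above
  }
}

Either way the result is the same: the header row never reaches Elasticsearch (or stdout).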