Trainings

Monday, May 21, 2018

Progressive Web Applications


Progressive Web Applications are a mix of native apps and mobile websites: applications (mainly websites) running in the browser are now capable, to some extent, of features previously provided only by native apps.
These applications work across devices, i.e. they can run on desktop, mobile, etc., which saves the effort of building duplicate apps for iOS and Android.

These features are consumed with the help of a JavaScript "Service Worker", so any existing web application can adopt them progressively and provide an engaging, fast user experience to its users.

Below are a few features provided by Progressive Web Apps:

  1. Works with a low-quality or no internet connection.
  2. Native-app-like: provides features similar to desktop apps.
  3. Can be installed on the home screen.
  4. Very fast start-up time: caches the most used files, such as CSS, JS, and images, on the client side.
  5. Offline ready: some features of the web app work offline with the help of a Service Worker, which uses the Cache API and/or IndexedDB.
  6. Push notifications.
  7. Background sync.
  8. Client-side caching, which saves network bandwidth.
  9. Downloadable and installable.
  10. Access to the device camera.
  11. Access to the user's location.
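The fast start-up and offline points above come down to a cache-first lookup inside the Service Worker's fetch handler. Here is a minimal sketch of that logic; the cache and network functions are injected (as a plain Map and an async function) so the idea is visible outside a browser. In a real worker these would be `caches.open(...)` and `fetch(request)`, and the function names here are hypothetical.

```javascript
// Sketch of a cache-first strategy, the core of a Service Worker fetch handler.
// "cache" and "fetchFromNetwork" are injected stand-ins for the browser's
// Cache API and fetch(); the names are illustrative, not a real API.
async function cacheFirst(request, cache, fetchFromNetwork) {
  const cached = await cache.get(request);  // serve from cache when possible
  if (cached !== undefined) {
    return cached;
  }
  const response = await fetchFromNetwork(request); // fall back to the network
  cache.set(request, response);                     // and cache it for next time
  return response;
}
```

In an actual Service Worker this logic would run inside `self.addEventListener('fetch', ...)`, responding with the cached copy so the app starts instantly and keeps working offline.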


Monday, May 14, 2018

Angular 2 vs 4 vs 5


A comparison of Angular 2, 4, and 5. Angular 6 has just been released; it will be added later.

Angular 2
 - Improved data binding.
 - Based entirely on components.
 - Support for TypeScript: you get access to all the advantages, libraries, and technologies associated with TypeScript.

Angular 4
 - Templates: template is now ng-template; you should use the ng-template tag.
 - In Angular 2.0 email validation used the pattern option, but Angular 4.0 ships a dedicated email validator.
 - It is possible to render Angular applications outside of the browser (server-side rendering with Angular Universal).
 - The *ngComponentOutlet directive enables developers to build dynamic components in a declarative way.
 - Animations now have their own package, @angular/platform-browser/animations.

Angular 5
 - Support for TypeScript 2.5 (previously TypeScript 2.3).
 - Angular Universal support in the CLI: ng generate universal.
 - Support for Angular CLI 1.6 for building applications that take advantage of the new service worker support; using @angular/service-worker can improve the loading performance of your applications.
 - App Shell support (the App Shell uses the router to render your application): ng generate app-shell.
 - You can give multiple names to your components while exporting; this works for both directives and components.
 - New pipes for numbers, dates, and currencies.
 - ngModelChange is now emitted after the value is updated.
 - Form validation can now be fired on submit instead of on every value change.
 - New lifecycle events added to the router: ActivationStart, ActivationEnd, ChildActivationStart, ChildActivationEnd, GuardsCheckStart, GuardsCheckEnd, ResolveStart, and ResolveEnd.
 - In earlier versions of Angular we depended on i18n polyfills whenever we wanted internationalization. Angular 5 provides new date, number, and currency pipes that improve internationalization across all browsers and eliminate the need for i18n polyfills.
 - Before version 4.3 we used the @angular/http module for all kinds of HTTP requests. In Angular 5, @angular/http is deprecated in favour of the new HttpClientModule, which comes under the @angular/common/http package.
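The multiple-export-names point can be illustrated with a small fragment. This is a hypothetical directive, not from any real project; the Angular 5 part is that exportAs accepts a comma-separated list of names:

```typescript
// Hypothetical directive showing Angular 5's multiple exportAs names:
// templates can reference it as #t="tooltip" or #t="appTooltip".
import { Directive } from '@angular/core';

@Directive({
  selector: '[appTooltip]',
  exportAs: 'tooltip, appTooltip'  // comma-separated list, new in Angular 5
})
export class TooltipDirective {}
```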

Thursday, February 8, 2018

Sending log file / CSV data to Elasticsearch without the header row - PART II

Let's take my previous example. I have the following sample CSV data:


stock,open,close,date
icici,240,350,05-02-2014
sbi,140,250,05-02-2014
infy,950,1150,05-02-2014
tcs,2400,3500,05-02-2014

When using the Logstash config file below, the header row goes to standard output along with the data rows. (We could replace stdout with the elasticsearch output; the stdout plugin is used here for demonstration purposes.)

logstash.conf

input {
  file {
    path => "C:/java-developper-softwares/prashant/ELK/data/stock.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    columns => ["stock", "open", "close", "date"]
    separator => ","
  }
  # if [message] =~ /^stock/ {
  #   drop {}
  # }
}
output {
  stdout {
    codec => rubydebug
  }
}
Running the command "logstash -f logstash.conf" prints every row, including the header row, to standard output.
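With the csv filter applied, the rubydebug codec prints each event as a hash of fields. For the first data row the output looks roughly like this (host and timestamp are placeholders; all csv values arrive as strings):

```
{
       "message" => "icici,240,350,05-02-2014",
         "stock" => "icici",
          "open" => "240",
         "close" => "350",
          "date" => "05-02-2014",
          "path" => "C:/java-developper-softwares/prashant/ELK/data/stock.csv",
          "host" => "my-machine",
      "@version" => "1",
    "@timestamp" => 2018-02-08T00:00:00.000Z
}
```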
Now our requirement is that we don't want to send the header row, i.e. "stock,open,close,date".
Since we now parse the CSV and read each column separately, we can compare the value of the stock column, and if it equals the string "stock" we can drop the row.
I compare the stock column here, but logically we could compare any column defined for our CSV file.

So let's modify our logstash.conf as follows:
input {
  file {
    path => "C:/java-developper-softwares/prashant/ELK/data/stock.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    columns => ["stock", "open", "close", "date"]
    separator => ","
  }
  if [stock] == "stock" {
    drop {}
  }
}
output {
  stdout {
    codec => rubydebug
  }
}
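Outside Logstash, the column-equality check in this config can be sketched in a few lines of JavaScript. The column names come from the config above; the function name is illustrative:

```javascript
// Sketch of the filter's logic: split each CSV line into named columns,
// then drop any row whose "stock" column literally equals "stock"
// (i.e. the header row) -- the same effect as the drop {} conditional.
const columns = ["stock", "open", "close", "date"];

function parseAndDropHeader(lines) {
  return lines
    .map(line => {
      const values = line.split(",");
      const row = {};
      columns.forEach((col, i) => { row[col] = values[i]; });
      return row;
    })
    .filter(row => row.stock !== "stock");
}
```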


Wednesday, February 7, 2018

Change your credentials in Eclipse Git (EGit) when your password expires or changes

We all know that passwords expire after a certain time period. So how do you set the new password in the Eclipse Git client? Here are the steps:

Open the Git Repositories view by clicking the Open Perspective button in Eclipse and selecting Git.

Now expand Remotes >> origin, right-click on the green node, and select "Change Credentials".

Now try to push to upstream again; the authorization issue should be resolved.


Tuesday, February 6, 2018

Sending log file / CSV data to Elasticsearch without the header row - PART I

Let's take an example. I have the following sample CSV data:

stock,open,close,date
icici,240,350,05-02-2014
sbi,140,250,05-02-2014
infy,950,1150,05-02-2014
tcs,2400,3500,05-02-2014

When using the Logstash config file below, the header row goes to standard output along with the data rows. (We could replace stdout with the elasticsearch output; the stdout plugin is used here for demonstration purposes.)

logstash.conf

input {
  file {
    path => "C:/prashant/ELK/data/stock.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
output {
  stdout {
    codec => rubydebug
  }
}

Running the command "logstash -f logstash.conf" prints every row, header row included, to standard output.

Now our requirement is that we don't want to send the header row, i.e. "stock,open,close,date".
So let's modify our logstash.conf as follows:
input {
  file {
    path => "C:/prashant/ELK/data/stock.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  if [message] =~ /^stock/ {
    drop {}
  }
}
output {
  stdout {
    codec => rubydebug
  }
}

Here we are using the filter plugin to check whether the message starts with the string "stock", and to drop that row if it does. And voila, see the output below.

So the crux of this post: use the filter plugin to check a field's value and drop the matching row.
filter {
  if [message] =~ /^stock/ {
    drop {}
  }
}
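The effect of this filter can be sketched outside Logstash as a simple prefix test on the raw lines (the function name is illustrative):

```javascript
// Sketch of the Part I approach: drop any raw line whose message matches
// /^stock/ (the header row) before it reaches the output stage.
function dropHeaderRows(lines) {
  return lines.filter(line => !/^stock/.test(line));
}
```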

In my next post, I will walk through how to ignore the first row when reading this file as a CSV, i.e. as a comma-delimited list of columns.

Monday, February 5, 2018

Use multiple Logstash config files at a time on a single command line or as a service

Problem statement: you have multiple Logstash config files (each configured with different data) for posting data from different machines in a cluster, which requires opening as many command-line instances as there are config files. Is it possible to run all the config files from a single instance, or anything similar?

Answer: either put all the files in a directory and run Logstash with -f path/to/directory,
or use multiple -f options, each pointing to one of the files, i.e.
Command: logstash -f Sample1.conf -f Sample2.conf

Note: keep in mind that Logstash has a single event pipeline (internally, all configuration files are concatenated and treated as one big file), so all filters and outputs apply to all events unless you use conditionals to select how they apply.

You need to use conditionals to select which filters and outputs apply to which events.

filter {
  if [type] == "event" {
    # do stuff
  }
  else if [type] == "data" {
    # do other stuff
  }
}

OR

input {
  file {
    path => "BatchData\Batch_Raw_Data.csv"
    tags => [ "batchdata" ]
    start_position => "beginning"
  }
}
output {
  if "batchdata" in [tags] {
    elasticsearch {
      action => "index"
      index => "indexname"   # Elasticsearch index names must be lowercase
    }
  }
}
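The two invocation styles described in the answer look like this on the command line (paths are illustrative):

```
# Run every config file in a directory as one concatenated pipeline
logstash -f /path/to/conf.d/

# Or list the files individually; the result is the same single pipeline
logstash -f Sample1.conf -f Sample2.conf
```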

If you really need multiple event pipelines that can be stopped and restarted individually, you will indeed have to run multiple instances of Logstash.

Sunday, October 9, 2016