WYSIWYG

http://kufli.blogspot.com
http://github.com/karthik20522

Friday, July 26, 2013

Spray.io Quick Resources

Spray Authentication Routing References
  • http://www.gtan.com/akka_doc/scala/routing.html
  • http://spray.io/documentation/spray-routing/
  • https://github.com/spray/spray/wiki/Authentication-Authorization
  • https://github.com/spray/spray/wiki/Configuration
  • https://github.com/spray/spray/wiki

Spray Test
  • http://spray.io/documentation/spray-testkit/
  • http://etorreborre.github.io/specs2/guide/org.specs2.guide.Runners.html#Via+SBT

Spray Logback/Logging
  • http://doc.akka.io/docs/akka/2.1.0/scala/logging.html
  • http://tantrajnana.blogspot.com/2012/10/using-c3p0-with-logback.html
  • http://doc.akka.io/docs/akka/snapshot/java/logging.html


Friday, July 12, 2013

Beers in my Belly - XIX


Bretagne Celtic

Kronenbourg


Thursday, July 11, 2013

Swaggered Development

The whole notion of an interface-based development model is to decouple backend developers from frontend developers by exposing the operations and models up front. While models represent the entity fields and their properties, such as required status, max length, and other validation characteristics, operations represent the RESTful endpoints or API operations for the client to consume.
Swagger is an API documentation service that helps define operations and models in a form that is both machine-readable and human-readable. Swagger exposes a JSON schema (draft-3 representation) of the operations and models for machine readability, and a UI that presents the operations, requests, responses, summaries, error codes, etc. for human readability.

By exposing the endpoint information along with the request and response parameters, you have effectively provided insight into your operations. On any development project consisting of frontend and backend developers, it is always a waiting game for the frontend devs, since they depend on the API and its documentation before they can consume it and develop against it. Providing insight into the operations up front allows consuming developers to start mocking the REST service responses immediately.

Much said, let's build a simple Swagger doc using Swagger.Net and WebApi. Swagger.Net and SwaggerUI can be downloaded as NuGet packages.
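
From the NuGet Package Manager Console, the installs would look something like this (I'm assuming the package IDs here are Swagger.Net and Swagger.Net.UI; check the NuGet gallery for the exact names):

    PM> Install-Package Swagger.Net
    PM> Install-Package Swagger.Net.UI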


Once these two packages are installed, make sure XML documentation is enabled in the project properties of your solution.


To get your API endpoints swaggerified, add XML comments, remarks, and parameter information to each action's XML description, such as the following:
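
As a rough sketch (the controller, model, and in-memory store below are made up purely for illustration), an annotated ASP.NET WebApi action might look like this:

    using System.Collections.Generic;
    using System.Net;
    using System.Web.Http;

    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class CustomerController : ApiController
    {
        // In-memory stand-in for a real data store, purely for illustration.
        private static readonly Dictionary<int, Customer> Store =
            new Dictionary<int, Customer>
            {
                { 1, new Customer { Id = 1, Name = "Acme" } }
            };

        /// <summary>
        /// Gets a customer by its unique identifier.
        /// </summary>
        /// <remarks>Returns a 404 response when no customer matches the supplied id.</remarks>
        /// <param name="id">The unique id of the customer to fetch.</param>
        /// <returns>The matching customer record.</returns>
        public Customer Get(int id)
        {
            Customer customer;
            if (!Store.TryGetValue(id, out customer))
                throw new HttpResponseException(HttpStatusCode.NotFound);
            return customer;
        }
    }

The summary, remarks, and param tags are what end up in the generated documentation, so the more descriptive they are, the more useful the resulting API explorer becomes.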
Basically, upon compilation a resource list file (JSON) is created for machine readability, and SwaggerUI provides a very neat API explorer and documentation interface for human readability, similar to the one below.
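
The exact shape of the resource list depends on the Swagger.Net version; roughly, it is a swagger-spec-style listing along these lines (the paths here are made up):

    {
      "apiVersion": "1.0",
      "swaggerVersion": "1.2",
      "basePath": "http://localhost/api",
      "apis": [
        { "path": "/Customer", "description": "Operations about customers" }
      ]
    }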


In part 2 of Swaggered Development, I shall show how to generate class files from the resource files, as well as how to handle validation.


Tuesday, July 9, 2013

Log analysis using Logstash, ElasticSearch and Kibana

Introduction:
Logstash is a free tool for managing events and logs. It has three primary components: an input module for collecting logs from various sources, a filter module for tweaking and parsing the data, and finally an output module for saving or passing the parsed data along to other systems [http://logstash.net/docs/1.1.13/].
ElasticSearch is an awesome distributed, RESTful, free, Lucene-powered search engine/server. Unlike SOLR, ES is very simple to use and maintain, and similar to SOLR, indexing is near real-time.
Kibana is a presentation layer that sits on top of ElasticSearch to analyze and make sense of the logs that Logstash throws into ElasticSearch. It is a highly scalable interface for Logstash and ElasticSearch that allows you to efficiently search, graph, analyze, and otherwise make sense of a mountain of logs.
The Logstash + ElasticSearch + Kibana combination can be compared to an open-sourced Splunk, but on a smaller scale.

Setup:
Logstash is as easy as downloading the JAR file [http://logstash.net/], setting up the input and output sources, and running the java command. In this example, I will be monitoring a log file and writing it into an ElasticSearch server for users to analyze the data using Kibana.
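
A minimal sketch of such a logstash.conf (the file path and type tag are placeholders; adjust both to your setup):

    input {
      file {
        path => "C:/logs/dataLog.log"   # log file to tail (placeholder path)
        type => "datalog"               # tag events coming from this input
      }
    }

    output {
      elasticsearch_http {
        host => "localhost"             # ElasticSearch server to write events to
      }
    }

With the config saved, Logstash runs as a single java command (the JAR name varies by release; 1.1.13 shown here):

    java -jar logstash-1.1.13-flatjar.jar agent -f logstash.conf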


ElasticSearch: download the zip package from the site [http://www.elasticsearch.org/download/] and run the elasticsearch.bat file.
Note: make sure JAVA_HOME is set up correctly for Logstash and ElasticSearch to work.



Kibana: download the Kibana files from GitHub [https://github.com/elasticsearch/kibana] and either run it as a standalone app or make it part of the ElasticSearch plugins. You can do the latter by copying the Kibana files into the ElasticSearch plugins/sites directory. Then:



  • Open config.js in your favorite editor
  • Set elasticsearch: 'http://localhost:9200' to point to your ElasticSearch server

Use case:
In general, most of these log analyzers talk about analyzing website traffic, similar to the videos Kibana has on their website [http://kibana.org/about.html]. This is great, but in the real world, logs and events are more than just website traffic: information-flow checkpoints, performance data, and so on.
In our case, let's assume we have some data being passed from one system to another, and we are logging it to a file. A simple representation of this information flow is as follows:

    Ingest → Digest → Process → Exit

So there are basically four systems, or states, that the data passes through: Ingest, Digest, Process, and Exit. At each of these systems, an event is logged to track the data flow; these are essentially checkpoints. The events are logged to the dataLog.log file mentioned in the Logstash configuration file above.
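
As an illustration (the line format and field names here are hypothetical), a single checkpoint line in dataLog.log might read:

    2013-07-09 10:15:30 id=12345 state=Digest status=OK

and once Logstash ships it, it would land in ElasticSearch as a JSON document along these lines (1.1.x-era events use @-prefixed metadata fields):

    {
      "@timestamp": "2013-07-09T10:15:30.000Z",
      "@message": "2013-07-09 10:15:30 id=12345 state=Digest status=OK",
      "@source_path": "C:/logs/dataLog.log",
      "@type": "datalog"
    }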

Once Logstash is up and running, it basically tails the file and copies the logged events into ElasticSearch as JSON objects. ElasticSearch indexes all the fields, and Kibana is then ready to access the data. The following are some of the cases that can be analyzed using Kibana:

Show all data flowing through the system

Filter by Id


Get All Error'd


Advanced Filter using Lucene Syntax
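
As a sketch, assuming hypothetical event fields named id, state, and status, the query strings behind the cases above might look like:

    *                                # show all events
    id:12345                         # filter by a specific id
    status:ERROR                     # get all error'd events
    state:Digest AND status:ERROR    # advanced filter using Lucene syntax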



The above reports/analyses are just a few examples of what can be achieved using Kibana + ElasticSearch. With Kibana you can design your own custom dashboards, with configurable panels that can be grouped by role. Charts and panels are fully interactive, with features like drill-down, range selection, and customization. And with ElasticSearch, handling rapid data growth is as easy as adding more ES servers (in a cluster).
