
# amazon-mws-elastic

Upload Amazon Seller Central reports into Elastic (ELK stack). Currently supports the following reports:

- Fulfilled Shipments

## Installation

### Install Elasticsearch

https://www.elastic.co/downloads/elasticsearch

### Install Logstash

https://www.elastic.co/downloads/logstash

### Install Kibana

https://www.elastic.co/downloads/kibana
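
On package-based installs the three services can then be started through systemd. A minimal sketch, assuming deb/rpm packages managed by systemd:

```sh
sudo systemctl start elasticsearch
sudo systemctl start kibana
sudo systemctl start logstash
```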

## Configure Logstash

### Install the Translate plugin

To install the translate plugin, stop the Logstash service and run the following command on your Logstash host:

```sh
/usr/share/logstash/bin/logstash-plugin install logstash-filter-translate
```
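
You can confirm that the plugin is installed before starting Logstash again:

```sh
/usr/share/logstash/bin/logstash-plugin list | grep translate
```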

### Elasticsearch template to map geo_point to the field that contains geo locations

To be able to map buyers' addresses to a geo_point on the Kibana world map, we need to translate zip codes into geo locations. This is done in the filter section of the Logstash config file like this:

```
filter {
  mutate {
    add_field => { "latitude" => "%{[translation][0]}" }
    add_field => { "longitude" => "%{[translation][1]}" }
  }
  mutate {
    convert => { "longitude" => "float" }
    convert => { "latitude" => "float" }
  }
  mutate {
    rename => {
      "longitude" => "[location][lon]"
      "latitude" => "[location][lat]"
    }
  }
}
```
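
The `[translation]` field referenced above is expected to be populated by a translate filter earlier in the pipeline. A minimal sketch, where the source field name and the dictionary file are assumptions; the dictionary would map each zip code to a `[latitude, longitude]` array:

```
filter {
  translate {
    field => "ship-postal-code"                        # assumed source field from the report
    destination => "translation"
    dictionary_path => "/etc/logstash/zip_to_geo.yml"  # assumed lookup file, e.g.
    # "49404": [43.0, -86.0]                           # illustrative coordinates
  }
}
```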

The new location field is then mapped as a geo_point by a template inside Elasticsearch. For Logstash to be able to upload this template into Elasticsearch, the template path must be defined in the output settings of the Logstash config. The template is uploaded into Elasticsearch when Logstash starts up or restarts. The relevant excerpt of the template's mappings:

"properties": {
        "@version": { "type": "string", "index": "not_analyzed" },
        "location": { "type": "geo_point" }
}
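
For reference, the output section could look like the following sketch. The template path matches the curl example below; the hosts and index name are assumptions for illustration:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mws-collector-reports-fulfillment-%{+YYYY.MM.dd}"  # assumed index naming
    template => "/etc/logstash/templates/mws-collector-reports-fulfillment.json"
    template_name => "mws-collector-reports-fulfillment"
  }
}
```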

### Manually import template into Elasticsearch

To manually import this template into Elasticsearch, do the following:

```sh
curl -XPUT 'http://localhost:9200/_template/mws-collector-reports-fulfillment' -d@/etc/logstash/templates/mws-collector-reports-fulfillment.json
```
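
To verify that the template made it into Elasticsearch:

```sh
curl -XGET 'http://localhost:9200/_template/mws-collector-reports-fulfillment?pretty'
```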

## Upload CSV data into Logstash

Everything should now be ready for the Amazon Seller Central CSV data sets. First generate the Fulfilled Shipments data report in Seller Central and download it. Copy this file to the path defined in the input settings of the Logstash config file:

```
input {
  file {
    path => "/tmp/fulfillment/*"
  }
}
```
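
The named fields seen in the debug output below come from parsing the report rows. A minimal sketch of how that could be done with the csv filter; the separator and header handling are assumptions, so check the format of the downloaded report:

```
filter {
  csv {
    separator => ","                 # assumption: switch to "\t" if the report is tab-separated
    autodetect_column_names => true  # use the report's header row as field names
  }
}
```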

Now tail the Logstash logs and check that the data is flowing correctly through Logstash:

```sh
tail -f /var/log/logstash/logstash-plain.log
```

```
[2017-03-01T09:38:50,385][DEBUG][logstash.pipeline        ] output received {"event"=>{"bill-country"=>nil, "amazon-order-item-id"=>"45646456", "type"=>"mws-collector-reports-fulfillment", "tracking-number"=>"xxxxxxxxxxxxx", "path"=>"/tmp/fulfillment/AMAZON_FULFILLED_SHIPMENTS_DATA.csv", "amazon-order-id"=>"xxxxxxx", "item-promotion-discount"=>0.0, "estimated-arrival-date"=>"2017-02-07T04:00:00+00:00", "ship-postal-code"=>49404,
```

## Visualize Amazon reports from Seller Central in Kibana (Elastic v5.2)

Select Management->Index patterns->Add New

*(screenshot: creating the index pattern in Kibana)*

Make sure that location is defined as type:geo_point (if not, you need to upload the Elasticsearch template).
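
You can also check the mapping directly in Elasticsearch (the index pattern is an assumption matching the naming above):

```sh
curl -XGET 'http://localhost:9200/mws-collector-reports-fulfillment-*/_mapping?pretty'
```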

*(screenshot: the mws-collector-reports-fulfillment-* index pattern in Kibana)*

The template should look like this inside Elasticsearch:

*(screenshot: the template in the Kibana console)*

Next, create the map with Visualize->Tile Map, then select your index with location as the geo_point field.

*(screenshot: Tile Map configuration in Kibana)*

The map should show the data like this:

*(screenshot: Tile Map displaying order locations)*