In a previous article, I built a custom version of Kibana with an export feature added to the Discover tab, but honestly it took quite a lot of trouble.
en-designetwork.hatenablog.com
Since then, someone has published the same functionality as a patch, so I tried it out.
- How to use the Kibana Export patch
- Deploy with Docker
- Outline of the export operation
- Summary - Patch Kibana 5 and export CSV from the Discover tab
How to use the Kibana Export patch
The usage introduced by the patch author is as follows. Looking at the archive, the patch consists of replacement files for the src directory. After replacing them, delete the optimize directory; it is recreated automatically at startup.
```shell
cd ${KIBANA_HOME}
tar -zcvf optimize-backup.tar.gz optimize
rm -rf optimize
wget https://github.com/fbaligand/kibana/releases/download/v5.4.0-csv-export/csv-export-patch.tar.gz
tar -zxvf csv-export-patch.tar.gz
rm -f csv-export-patch.tar.gz
bin/kibana
```
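The patch is released per Kibana version under the naming scheme seen in the URL above (`v<version>-csv-export`). As a small illustrative sketch, the download URL for a given version can be derived like this (`patch_url` is a hypothetical helper of my own; only the 5.3.0 and 5.4.0 releases are referenced in this article, so check that a release actually exists for your version):

```shell
#!/bin/sh
# Sketch: build the csv-export patch URL for a given Kibana version,
# following the release naming seen above (v<version>-csv-export).
# patch_url is a hypothetical helper, not part of the patch itself.
patch_url() {
  echo "https://github.com/fbaligand/kibana/releases/download/v$1-csv-export/csv-export-patch.tar.gz"
}

patch_url 5.3.0
```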
Deploy with Docker
Since Kibana is already running in my environment, I deploy the patched version as a Docker container to avoid conflicts.
I made it deployable with Docker Compose. Note that parameters such as the Elasticsearch connection are not read from environment variables; instead, the config directory is mounted into the container.
```shell
$ tree
.
├── README.md
├── docker-compose.yml
└── kibana_5.3
    ├── Dockerfile
    └── config
        └── etc
            ├── kibana.log
            └── kibana.yml
```
The base image is simply built from CentOS. The official Elastic image feels slow to start because it bundles extra features such as the X-Pack installation. Since the existing Elasticsearch is version 5.3.0, Kibana must match that version. The patch is released per version, so use the one corresponding to your Kibana.
```shell
$ sudo vi ./kibana_5.3/Dockerfile
FROM centos

RUN rpm --import https://artifacts.elastic.co/GPG-KEY-elasticsearch
RUN echo $'[kibana-5.x]\n\
name=Kibana repository for 5.x packages\n\
baseurl=https://artifacts.elastic.co/packages/5.x/yum\n\
gpgcheck=1\n\
gpgkey=https://artifacts.elastic.co/GPG-KEY-elasticsearch\n\
enabled=1\n\
autorefresh=1\n\
type=rpm-md' > /etc/yum.repos.d/elastic.repo
RUN yum install -y kibana-5.3.0-1 && yum clean all

WORKDIR /usr/share/kibana
RUN curl -OL https://github.com/fbaligand/kibana/releases/download/v5.3.0-csv-export/csv-export-patch.tar.gz
RUN tar -xzvf ./csv-export-patch.tar.gz
RUN chown kibana:kibana -R ./src
RUN rm -rf ./optimize/*

CMD /usr/share/kibana/bin/kibana
```
The docker-compose file is fairly simple:
```shell
$ sudo vi ./docker-compose.yml
version: '2.1'
services:
  kibana:
    build: kibana_5.3
    container_name: kibana_5.3_export
    ports:
      - 5605:5601
    networks:
      - kibana_net
    volumes:
      - ./kibana_5.3/config/etc/:/etc/kibana/
networks:
  kibana_net:
```
Create the configuration file separately and mount it at container startup. Log output is directed to the same directory so that only one directory needs to be mounted.
```shell
$ sudo vi ./kibana_5.3/config/etc/kibana.yml
server.port: 5601
server.host: "0.0.0.0"
server.name: "kibana.designet.local"
elasticsearch.url: "http://192.168.1.81:9200"
logging.dest: /etc/kibana/kibana.log
```
It takes some time to start up
At startup, the optimize processing takes a few minutes. In my environment Kibana came up in about 90 seconds, as shown below (logs also written to STDOUT).
```shell
$ sudo docker-compose up
Starting kibana_5.3_export ...
Starting kibana_5.3_export ... done
Attaching to kibana_5.3_export
kibana_5.3_export | {"type":"log","@timestamp":"2017-07-16T17:17:12Z","tags":["info","optimize"],"pid":1,"message":"Optimizing and caching bundles for kibana, timelion and status_page. This may take a few minutes"}
kibana_5.3_export | {"type":"log","@timestamp":"2017-07-16T17:18:43Z","tags":["info","optimize"],"pid":1,"message":"Optimization of bundles for kibana, timelion and status_page complete in 90.28 seconds"}
kibana_5.3_export | {"type":"log","@timestamp":"2017-07-16T17:18:43Z","tags":["status","plugin:kibana@5.3.0","info"],"pid":1,"state":"green","message":"Status changed from uninitialized to green - Ready","prevState":"uninitialized","prevMsg":"uninitialized"}
```
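Because the logs are structured JSON (and are also written to the mounted kibana.log), readiness can be checked mechanically rather than by watching the console. A minimal sketch that greps for the green status line; the sample line is copied from the startup output above and stands in for the real log file:

```shell
#!/bin/sh
# Sketch: detect Kibana readiness by grepping the JSON log for the
# "green" status line. A sample line from the startup log above stands
# in here for /etc/kibana/kibana.log.
LOG_LINE='{"type":"log","@timestamp":"2017-07-16T17:18:43Z","tags":["status","plugin:kibana@5.3.0","info"],"pid":1,"state":"green","message":"Status changed from uninitialized to green - Ready"}'

if echo "$LOG_LINE" | grep -q '"state":"green"'; then
  echo "Kibana is ready"
fi
```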
Outline of the export operation
With the patched version, you can export like this. To export specific columns, select them in the Discover tab, save the search, and reopen it; otherwise only the _source field is displayed.
From there you can export to CSV as follows.
"@timestamp",action,"if_src","if_dst",proto,direction,"ip_src","ip_local","port_src",nat,"ip_dst","port_local","port_dst" "July 17th 2017, 00:40:00.000",Built,outside,management,tcp,outbound,"172.217.27.68"," - ",443," - ","192.168.1.104",,"53,574" "July 17th 2017, 00:40:00.000",Built,management,outside,tcp," - ","192.168.1.104"," - ","53,574",dynamic,"222.159.141.103",,"53,574" "July 17th 2017, 00:40:00.000",Built,management,outside,tcp," - ","192.168.1.104"," - ","53,575",dynamic,"222.159.141.103",,"53,575"
Summary - Patch Kibana 5 and export CSV from the Discover tab
With the patch applied, Kibana 5 can export search results as CSV from the Discover tab. Since the export function can be added with such a simple procedure, I consider it a very useful patch.