AWS Security JAWS: Best Practices for Analyzing Honeypot Logs Economically?
2. Self-introduction
- Masamitsu Maehara (前原 応光)
- Future Architect, Inc.
- Technology Innovation Group
- Works on AWS and related things for enterprise customers
- Easygoing engineer
@micci184
7. Elastic Stack
- Logstash: ingests, transforms, and stores data
- Beats: data shippers (handle a lot of things for you)
- Elasticsearch: used for search and analytics
- Kibana: builds visualizations and dashboards
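The confs later in this deck all follow Logstash's three-block pipeline layout (input → filter → output); as a minimal illustrative skeleton, not from the slides:

```
input  { stdin { } }
filter {
  mutate { add_field => { "tagged_by" => "demo" } }
}
output { stdout { codec => rubydebug } }
```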
25. Publish to PASTEBIN or not
- dionae01: Global IP address
- dionae02: architect-tech.com → PASTEBIN
- dionae03: Global IP address
- dionae04: architect-tech.net → PASTEBIN
29. Log aggregation
[Diagram: Cowrie honeypots in California, Canada, Ireland, São Paulo, Singapore, and Tokyo each write to CloudWatch Logs in their region; a single Logstash instance in Virginia collects from all of them]
- Upload the Cowrie logs to CloudWatch Logs
- Logstash fetches the logs from each CloudWatch Logs log group
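One way to get the Cowrie logs into CloudWatch Logs is the CloudWatch Logs agent (awslogs); this is only a sketch under assumptions — the `california_cowrie` log group name is taken from a later slide, and the file path assumes Cowrie's default log layout:

```
[general]
state_file = /var/lib/awslogs/agent-state

[cowrie-json]
file = /home/cowrie/cowrie/var/log/cowrie/cowrie.json
log_group_name = california_cowrie
log_stream_name = {instance_id}
```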
31. Logs to collect
- Cowrie.json in CloudWatch Logs
- VPC Flow Logs in CloudWatch Logs
- Route 53 logs in CloudWatch Logs
- Malware scan logs in S3

Logstash filters used:
- Grok filter: VPC Flow Logs & Route 53
- JSON filter: Cowrie.json & scan logs
32. input {
cloudwatch_logs {
region => "us-east-1"
log_group => [ "/aws/route53/architect-tech.com" ]
sincedb_path => "/var/lib/logstash/sincedb_architect_tech_com"
}
}
filter {
grok{
patterns_dir => [ "/etc/logstash/patterns/vpcflowlogs_patterns" ]
match => { "message" => "%{VPCFLOWLOG}"}
}
date {
match => ["start_time", "UNIX"]
target => "@timestamp"
}
geoip {
source => "src_ip"
target => "src_geoip"
}
}
output {
elasticsearch {
hosts => [ "localhost:9200" ]
index => "vpcflow-logs-%{+YYYYMMdd}"
}
}
# VPC_Flow_Logs
VPCFLOWLOG %{NUMBER:version} %{NUMBER:account_id} %{NOTSPACE:interface_id} %{IP:src_ip} %{IP:dst_ip} %{POSINT:src_port} %{POSINT:dst_port} %{NOTSPACE:protocol_id} %{NOTSPACE:packets} %{NOTSPACE:bytes} %{NUMBER:start_time} %{NUMBER:end_time} %{NOTSPACE:action} %{NOTSPACE:log_status}
★Pattern File
★Conf File
VPC Flow Logs
- The CloudWatch Logs input plugin must be installed
- The grok pattern is kept in an external file
- and called from the grok filter
- start_time is UNIX time, so the date filter converts it to @timestamp
- GeoIP adds geolocation for the source IP
- The output defines the index
- Create the index template in advance
  (steps omitted here)
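To sanity-check what the VPCFLOWLOG pattern extracts, here is a rough Python equivalent using named groups. The regex is a simplification of the grok pattern above, and the sample line is synthetic, merely following the documented v2 flow-log field order:

```python
import re

# Simplified stand-in for the VPCFLOWLOG grok pattern; group names
# match the grok captures above.
VPCFLOWLOG = re.compile(
    r"(?P<version>\d+) (?P<account_id>\d+) (?P<interface_id>\S+) "
    r"(?P<src_ip>\S+) (?P<dst_ip>\S+) (?P<src_port>\d+) (?P<dst_port>\d+) "
    r"(?P<protocol_id>\S+) (?P<packets>\S+) (?P<bytes>\S+) "
    r"(?P<start_time>\d+) (?P<end_time>\d+) (?P<action>\S+) (?P<log_status>\S+)"
)

# Synthetic v2 flow-log record (version account-id interface-id src dst
# srcport dstport protocol packets bytes start end action log-status)
line = ("2 123456789010 eni-abc123de 172.31.16.139 172.31.16.21 "
        "20641 22 6 20 4249 1418530010 1418530070 ACCEPT OK")

fields = VPCFLOWLOG.match(line).groupdict()
print(fields["src_ip"], fields["dst_port"], fields["action"])  # 172.31.16.139 22 ACCEPT
```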
33. input {
cloudwatch_logs {
region => "us-west-1"
log_group => [ "california_cowrie" ]
sincedb_path => "/var/lib/logstash/sincedb_vpcflowlogs_california"
}
}
filter {
grok {
patterns_dir => [ "/etc/logstash/patterns/route53_patterns" ]
match => { "message" => "%{ROUTE53LOG}" }
}
date {
match => [ "date", "ISO8601" ]
target => "@timestamp"
}
geoip {
source => "resolver_ip"
target => "src_geoip"
}
}
output {
elasticsearch {
hosts => [ "localhost:9200" ]
index => "route53-logs-%{+YYYYMMdd}"
}
}
# Route53
ROUTE53LOG %{NOTSPACE:version}\s%{TIMESTAMP_ISO8601:date} %{NOTSPACE:host_id}\s%{URIPROTO:query_name}\s%{WORD:query_type}\s%{WORD:response_code}\s%{WORD:protocol}\s%{NOTSPACE:edge}\s%{IP:resolver_ip}\s(%{IP:edns_client_subnet}/%{POSINT:edns_cidr}|-)
★Pattern File
★Conf File
Route 53
- Same approach as the VPC Flow Logs conf above
- The timestamp here is ISO8601
- GeoIP adds geolocation for the resolver IP
- The output defines the index
- By the way, if you want to write your own grok patterns, read the log format documentation carefully!
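In the same spirit as reading the log format carefully, here is a rough Python equivalent of the ROUTE53LOG pattern above. The regex is a simplification, and the sample line is synthetic, following the documented query-log field order:

```python
import re

# Simplified stand-in for the ROUTE53LOG grok pattern above.
ROUTE53LOG = re.compile(
    r"(?P<version>\S+)\s(?P<date>\S+)\s(?P<host_id>\S+)\s(?P<query_name>\S+)\s"
    r"(?P<query_type>\w+)\s(?P<response_code>\w+)\s(?P<protocol>\w+)\s"
    r"(?P<edge>\S+)\s(?P<resolver_ip>\S+)\s"
    r"(?:(?P<edns_client_subnet>[\d.]+)/(?P<edns_cidr>\d+)|-)"
)

# Synthetic query-log line: version, timestamp, hosted-zone ID, query name,
# type, response code, protocol, edge location, resolver IP, EDNS subnet or "-"
line = ("1.0 2017-12-13T08:16:02.130Z Z123412341234 example.com "
        "A NOERROR UDP IAD12 192.168.1.1 -")

fields = ROUTE53LOG.match(line).groupdict()
```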
34. input {
s3 {
bucket => "cowrie-log"
region => "us-east-1"
prefix => "california/"
interval => "30"
sincedb_path => "/var/lib/logstash/sincedb_cowrie_json_california"
codec => json
}
}
filter {
json {
source => "message"
}
date {
match => [ "timestamp", "ISO8601" ]
target => "@timestamp"
}
geoip {
source => "src_ip"
target => "src_geoip”
}
geoip {
source => "dst_ip"
target => "dst_geoip"
}
}
output {
elasticsearch {
hosts => [ "localhost:9200" ]
index => "cowrie-json-logs-%{+YYYYMMdd}"
}
}
★Conf File
Cowrie.json
- The S3 input plugin must be installed
- The logs are JSON, so read them with the JSON filter
- Geolocation is added for both the source IP and the destination IP
- Map the GeoIP fields when creating the index template
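The slides skip how to create the index template; a hedged sketch of one that maps the GeoIP location fields as geo_point might look like the following (the template name and the typeless Elasticsearch 7+ mapping syntax are assumptions):

```
PUT _template/cowrie-json-logs
{
  "index_patterns": ["cowrie-json-logs-*"],
  "mappings": {
    "properties": {
      "src_geoip": { "properties": { "location": { "type": "geo_point" } } },
      "dst_geoip": { "properties": { "location": { "type": "geo_point" } } }
    }
  }
}
```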
35. input {
s3 {
bucket => "cowrie-log"
region => "us-east-1"
prefix => "california/"
interval => "30"
sincedb_path => "/var/lib/logstash/sincedb_cowrie_json_california"
codec => json
}
}
filter {
json {
source => "message"
}
}
output {
elasticsearch {
hosts => [ "localhost:9200" ]
index => "vt-logs-%{+YYYYMMdd}"
}
}
★Conf File
VirusTotal
- Just apply the JSON filter!
- That's it!!
Install Plugins
- CloudWatch Logs input plugin
- S3 input plugin

★Install CloudWatch Logs input plugin
$ cd /usr/share/logstash/
$ bin/logstash-plugin install logstash-input-cloudwatch_logs
★Install S3 Input Plugin
$ cd /usr/share/logstash/
$ bin/logstash-plugin install logstash-input-s3