mihari 5.4.4 → 5.4.6
- checksums.yaml +4 -4
- data/README.md +3 -25
- data/docs/alternatives.md +5 -0
- data/docs/analyzers/binaryedge.md +26 -0
- data/docs/analyzers/censys.md +31 -0
- data/docs/analyzers/circl.md +37 -0
- data/docs/analyzers/crtsh.md +26 -0
- data/docs/analyzers/dnstwister.md +25 -0
- data/docs/analyzers/feed.md +73 -0
- data/docs/analyzers/greynoise.md +26 -0
- data/docs/analyzers/hunterhow.md +33 -0
- data/docs/analyzers/index.md +79 -0
- data/docs/analyzers/onyphe.md +26 -0
- data/docs/analyzers/otx.md +28 -0
- data/docs/analyzers/passivetotal.md +48 -0
- data/docs/analyzers/pulsedive.md +28 -0
- data/docs/analyzers/securitytrails.md +37 -0
- data/docs/analyzers/shodan.md +26 -0
- data/docs/analyzers/urlscan.md +28 -0
- data/docs/analyzers/virustotal.md +39 -0
- data/docs/analyzers/virustotal_intelligence.md +29 -0
- data/docs/analyzers/zoomeye.md +33 -0
- data/docs/configuration.md +35 -0
- data/docs/emitters/database.md +22 -0
- data/docs/emitters/hive.md +26 -0
- data/docs/emitters/index.md +7 -0
- data/docs/emitters/misp.md +21 -0
- data/docs/emitters/slack.md +26 -0
- data/docs/emitters/webhook.md +63 -0
- data/docs/enrichers/google_public_dns.md +19 -0
- data/docs/enrichers/index.md +6 -0
- data/docs/enrichers/ipinfo.md +19 -0
- data/docs/enrichers/shodan.md +22 -0
- data/docs/enrichers/whois.md +17 -0
- data/docs/github_actions.md +43 -0
- data/docs/index.md +13 -0
- data/docs/installation.md +31 -0
- data/docs/requirements.md +20 -0
- data/docs/rule.md +171 -0
- data/docs/tags.md +3 -0
- data/docs/usage.md +100 -0
- data/frontend/package-lock.json +232 -229
- data/frontend/package.json +7 -7
- data/lib/mihari/analyzers/feed.rb +7 -7
- data/lib/mihari/version.rb +1 -1
- data/lib/mihari/web/public/assets/{index-ef33a6cd.js → index-0a5a47bf.js} +43 -41
- data/lib/mihari/web/public/index.html +1 -1
- data/mihari.gemspec +1 -1
- data/mkdocs.yml +35 -0
- data/requirements.txt +2 -0
- metadata +45 -4

data/docs/analyzers/virustotal_intelligence.md
ADDED
@@ -0,0 +1,29 @@
---
tags:
- IP address
- Domain
- URL
- Hash
---

# VirusTotal Intelligence

- [https://www.virustotal.com](https://www.virustotal.com/gui/home/search)

This analyzer uses the VirusTotal Intelligence API. Pagination is supported.

```yaml
analyzer: virustotal_intelligence
query: ...
api_key: ...
```

## Components

### Query

`query` is a search query.

### API Key

`api_key` is an API key. Optional. Defaults to `ENV["VIRUSTOTAL_API_KEY"]`.

data/docs/analyzers/zoomeye.md
ADDED
@@ -0,0 +1,33 @@
# ZoomEye

- [https://zoomeye.org/](https://zoomeye.org/)

This analyzer uses the ZoomEye API v3. Pagination is supported.

The API endpoint used depends on the `type` option.

| Type | API endpoint   | Artifact type |
| ---- | -------------- | ------------- |
| web  | `/web/search`  | IP address    |
| host | `/host/search` | IP address    |

```yaml
analyzer: zoomeye
query: ...
type: ...
api_key: ...
```

## Components

### Query

`query` is a search query.

### Type

`type` determines the search type: `web` or `host`.

### API Key

`api_key` is an API key. Optional. Defaults to `ENV["ZOOMEYE_API_KEY"]`.
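
For context, an analyzer like this is normally placed inside a rule's `queries` section. A minimal sketch follows; the query string and `type` value are arbitrary examples, not part of the original documentation:

```yaml
queries:
  - analyzer: zoomeye
    query: 'app:"Apache httpd"' # arbitrary example query
    type: host
```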

data/docs/configuration.md
ADDED
@@ -0,0 +1,35 @@
# Configuration

Configuration can be done via environment variables.

| Environment variable   | Description                    | Default                |
| ---------------------- | ------------------------------ | ---------------------- |
| DATABASE_URL           | Database URL                   | `sqlite3:///mihari.db` |
| BINARYEDGE_API_KEY     | BinaryEdge API key             |                        |
| CENSYS_ID              | Censys API ID                  |                        |
| CENSYS_SECRET          | Censys secret                  |                        |
| CIRCL_PASSIVE_PASSWORD | CIRCL passive DNS/SSL password |                        |
| CIRCL_PASSIVE_USERNAME | CIRCL passive DNS/SSL username |                        |
| IPINFO_API_KEY         | IPInfo API key (token)         |                        |
| MISP_URL               | MISP URL                       |                        |
| MISP_API_KEY           | MISP API key                   |                        |
| ONYPHE_API_KEY         | Onyphe API key                 |                        |
| OTX_API_KEY            | OTX API key                    |                        |
| PASSIVETOTAL_API_KEY   | PassiveTotal API key           |                        |
| PASSIVETOTAL_USERNAME  | PassiveTotal username          |                        |
| PULSEDIVE_API_KEY      | Pulsedive API key              |                        |
| SECURITYTRAILS_API_KEY | SecurityTrails API key         |                        |
| SHODAN_API_KEY         | Shodan API key                 |                        |
| SLACK_CHANNEL          | Slack channel name             | `#general`             |
| SLACK_WEBHOOK_URL      | Slack Webhook URL              |                        |
| THEHIVE_URL            | TheHive URL                    |                        |
| THEHIVE_API_KEY        | TheHive API key                |                        |
| URLSCAN_API_KEY        | urlscan.io API key             |                        |
| VIRUSTOTAL_API_KEY     | VirusTotal API key             |                        |
| ZOOMEYE_API_KEY        | ZoomEye API key                |                        |
| SENTRY_DSN             | Sentry DSN                     |                        |
| RETRY_INTERVAL         | Retry interval                 | 5                      |
| RETRY_TIMES            | Retry times                    | 3                      |
| PAGINATION_LIMIT       | Pagination limit               | 100                    |

Alternatively, you can set values through a `.env` file. Values in the `.env` file are loaded automatically.

data/docs/emitters/database.md
ADDED
@@ -0,0 +1,22 @@
# Database

This emitter stores data in a database. It uses SQLite3 by default, but you can switch to MySQL or PostgreSQL. This database is Mihari's primary database: every piece of data generated by Mihari is stored in it, and you can view the data via the built-in web app.

Mihari loads the database URL from the environment variable `DATABASE_URL`. Defaults to `sqlite3:///mihari.db` (SQLite3).

If you want to use MySQL or PostgreSQL, set a database URL accordingly:

- MySQL: `mysql2://username:password@host:3306/database` (+ `gem install mysql2`)
- PostgreSQL: `postgres://username:password@host:5432/database` (+ `gem install pg`)

```yaml
emitter: database
```

!!! note

    You have to initialize the database with `mihari db migrate`.

## ER Diagram

![](https://imgur.com/krhoSgh.png)

data/docs/emitters/hive.md
ADDED
@@ -0,0 +1,26 @@
# TheHive

- [https://thehive-project.org/](https://thehive-project.org/)

This emitter creates an alert on TheHive. TheHive v4 & v5 are supported.

```yaml
emitter: the_hive
url: ...
api_key: ...
api_version: ...
```

## Components

### URL

`url` is a TheHive URL. Optional. Defaults to `ENV["THEHIVE_URL"]`.

### API Key

`api_key` is an API key. Optional. Defaults to `ENV["THEHIVE_API_KEY"]`.

### API Version

`api_version` is the TheHive API version. Optional. Defaults to `ENV["THEHIVE_API_VERSION"]`.

data/docs/emitters/misp.md
ADDED
@@ -0,0 +1,21 @@
# MISP

- [https://www.misp-project.org/](https://www.misp-project.org/)

This emitter creates an event on MISP based on an alert.

```yaml
emitter: misp
url: ...
api_key: ...
```

## Components

### URL

`url` is a MISP URL. Optional. Defaults to `ENV["MISP_URL"]`.

### API Key

`api_key` is an API key. Optional. Defaults to `ENV["MISP_API_KEY"]`.

data/docs/emitters/slack.md
ADDED
@@ -0,0 +1,26 @@
# Slack

- [https://slack.com/](https://slack.com/intl/ja-jp/)

This emitter posts a message to Slack via an incoming webhook.

```yaml
emitter: slack
webhook_url: ...
channel: ...
```

| Name        | Type   | Required? | Default                             | Desc.             |
| ----------- | ------ | --------- | ----------------------------------- | ----------------- |
| webhook_url | String | No        | `ENV["SLACK_WEBHOOK_URL"]`          | Slack webhook URL |
| channel     | String | No        | `ENV["SLACK_CHANNEL"]` / `#general` | Slack channel     |

## Components

### Webhook URL

`webhook_url` is a Slack incoming webhook URL. Optional. Defaults to `ENV["SLACK_WEBHOOK_URL"]`.

### Channel

`channel` is a Slack channel to send a message to. Optional. Defaults to `ENV["SLACK_CHANNEL"]` or `#general`.
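
For instance, a rule's `emitters` section could override the default channel as sketched below; the channel name is a hypothetical example:

```yaml
emitters:
  - emitter: slack
    channel: "#mihari-alerts" # hypothetical channel name
```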

data/docs/emitters/webhook.md
ADDED
@@ -0,0 +1,63 @@
# Webhook

This emitter sends an HTTP request with a payload built from the specified conditions.

```yaml
emitter: webhook
url: ...
method: ...
headers: ...
template: ...
```

## Components

### URL

`url` is a webhook URL.

### Method

`method` is an HTTP method. Optional. Defaults to `POST`.

### Headers

`headers` (hash) is HTTP headers. Optional.

### Template

`template` is an [ERB](https://github.com/ruby/erb) template used to customize the payload to be sent. A template should generate valid JSON.

You can use the following parameters inside an ERB template.

- `rule`: a rule
- `artifacts`: a list of artifacts

## Examples

### ThreatFox

```yaml
- emitter: webhook
  url: https://threatfox-api.abuse.ch/api/v1/
  headers:
    api-key: YOUR_API_KEY
  template: threatfox.erb
```

```ruby
{
  "query": "submit_ioc",
  "threat_type": "payload_delivery",
  "ioc_type": "ip:port",
  "malware": "foobar",
  "confidence_level": 100,
  "anonymous": 0,
  "iocs": [
    <% ips = @artifacts.select { |artifact| artifact.data_type == "ip" } %>
    <% ips.each_with_index do |artifact, idx| %>
    "<%= artifact.data %>:80"
    <%= ',' if idx < (ips.length - 1) %>
    <% end %>
  ]
}
```

data/docs/enrichers/google_public_dns.md
ADDED
@@ -0,0 +1,19 @@
---
tags:
- DNS record
---

# Google Public DNS

- [https://developers.google.com/speed/public-dns](https://developers.google.com/speed/public-dns)

This enricher uses Google Public DNS to enrich URL and domain artifacts.

```yaml
enricher: google_public_dns
```

## Supported Artifacts

- URL
- Domain

data/docs/enrichers/ipinfo.md
ADDED
@@ -0,0 +1,19 @@
---
tags:
- Autonomous system
- Geolocation
---

# ipinfo.io

- [https://ipinfo.io/](https://ipinfo.io/)

This enricher uses ipinfo.io API to enrich an IP artifact.

```yaml
enricher: ipinfo
```

## Supported Artifacts

- IP address

data/docs/enrichers/shodan.md
ADDED
@@ -0,0 +1,22 @@
---
tags:
- Port
- CPE
- DNS record
---

# Shodan

- [https://www.shodan.io/](https://www.shodan.io/dashboard)

This enricher uses Shodan InternetDB API to enrich an artifact.

[https://internetdb.shodan.io/](https://internetdb.shodan.io/)

```yaml
enricher: shodan
```

## Supported Artifacts

- IP address

data/docs/github_actions.md
ADDED
@@ -0,0 +1,43 @@
# GitHub Actions

GitHub Actions is a good way to run Mihari searches continuously.

The following is an example of a GitHub Actions workflow to run Mihari.

```yaml
name: Mihari searches

on:
  workflow_dispatch:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install dependencies
        run: sudo apt-get -yqq install sqlite3 libsqlite3-dev
      - name: Set up Ruby 3.2
        uses: ruby/setup-ruby@v1
        with:
          ruby-version: "3.2"
          bundler-cache: true
      - name: Run Mihari
        run: |
          mihari search /path/to/rule.yml
```

!!! tip

    You need to install `libpq-dev` for PostgreSQL and `libmysqlclient-dev` for MySQL.

This example assumes that you have a `Gemfile` in your repository.

```ruby
source "https://rubygems.org"

gem "pg" # if you use PostgreSQL
gem "mysql2" # if you use MySQL

gem "mihari"
```
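
To actually run searches continuously rather than only on demand, a `schedule` trigger can be added next to `workflow_dispatch`. A minimal sketch follows; the cron expression is an arbitrary example:

```yaml
on:
  workflow_dispatch:
  schedule:
    - cron: "0 */6 * * *" # run every six hours (example schedule)
```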
data/docs/index.md
ADDED
@@ -0,0 +1,13 @@
# Mihari

A query aggregator for OSINT-based threat hunting.

Mihari can aggregate multiple searches across multiple services in a single rule and persist findings in a database.

- [Requirements](./requirements.md)
- [Installation](./installation.md)
- [How to Write a Rule](./rule.md)
- [Usage](./usage.md)
- [Configuration](./configuration.md)
- [GitHub Actions](./github_actions.md)
- [Alternatives](./alternatives.md)

data/docs/installation.md
ADDED
@@ -0,0 +1,31 @@
# Installation

## Ruby Gem

Mihari is packaged as a Ruby gem.

```bash
gem install mihari
```

Mihari uses SQLite3 as a primary database by default. Thus a gem for SQLite (`sqlite3`) is installed by default.

If you want to use MySQL or PostgreSQL instead of SQLite3, please install the corresponding gem yourself.

**MySQL**

```bash
gem install mysql2
```

**PostgreSQL**

```bash
gem install pg
```

## Docker

You can build the Docker image by yourself.

`Dockerfile` is available at [https://github.com/ninoseki/mihari/tree/master/docker](https://github.com/ninoseki/mihari/tree/master/docker).

data/docs/requirements.md
ADDED
@@ -0,0 +1,20 @@
# Requirements

- Runtime:
    - Ruby 2.7+ / 3.0+ (tested with 2.7, 3.0, 3.1 and 3.2)
- Database:
    - SQLite3, PostgreSQL and MySQL
- Others:
    - MISP
    - TheHive

| Name       | Supported versions      |
| ---------- | ----------------------- |
| Ruby       | v2.7, v3.0, v3.1 & v3.2 |
| PostgreSQL | v15                     |
| SQLite     | v3                      |
| MySQL      | v8                      |
| MISP       | v2.4                    |
| TheHive    | v3 & v4                 |

You need a database to persist the data. See [Database](./emitters/database.md) for details.
data/docs/rule.md
ADDED
@@ -0,0 +1,171 @@
# How to Write a Rule

Mihari uses a [Sigma](https://github.com/SigmaHQ/sigma)-like format to describe a set of search queries that expresses a rule.

Mihari has three main components to compose a rule.

![](https://imgur.com/BBT99BG.png)

- Analyzers/Queries: a list of queries (analyzers) that builds a list of artifacts
- Enrichers: a list of enrichers that enriches a list of artifacts
- Emitters: a list of emitters that emits a list of artifacts as an alert

An artifact has five types:

- IP address (`ip`)
- Domain (`domain`)
- URL (`url`)
- Mail (`mail`)
- Hash (`hash`)

An alert can have multiple artifacts bundled by a rule.

!!! note

    A rule is assumed to be executed multiple times continuously. An alert generated by a rule will only contain the findings that are new at that time.

Let's break down the following example:

```yaml
id: c7f6968e-dbe1-4612-b0bb-8407a4fe05df
title: Example
description: Mihari rule example
created_on: "2023-01-01"
updated_on: "2023-01-02"
author: ninoseki
references:
  - https://github.com/ninoseki/mihari
related:
  - 6254bb74-5e5d-42ad-bc1e-231da0293b0f
tags:
  - foo
  - bar
queries:
  - analyzer: shodan
    query: ip:1.1.1.1
  - analyzer: censys
    query: ip:8.8.8.8
enrichers:
  - enricher: whois
  - enricher: ipinfo
  - enricher: shodan
  - enricher: google_public_dns
emitters:
  - emitter: database
  - emitter: misp
  - emitter: slack
  - emitter: the_hive
data_types:
  - hash
  - ip
  - domain
  - url
  - mail
falsepositives: []
```

## Components

### ID

`id` is a unique ID of a rule. UUID v4 is recommended.

### Title

`title` is a title of a rule.

### Description

`description` is a short description of a rule.

### Created/Updated On

`created_on` is the date of rule creation. Optional.
A rule can also have `updated_on`, the date of rule modification. Optional.

### Tags

`tags` is a list of tags of a rule.

### Author

`author` is an author of a rule. Optional.

### References

`references` is a list of references of a rule. Optional.

### Related

`related` is a list of related rule IDs. Optional.

### Queries

`queries` is a list of queries/analyzers.
See [Analyzers](./analyzers/index.md) for details of each analyzer.

### Enrichers

`enrichers` is a list of enrichers.
See [Enrichers](./enrichers/index.md) for details of each enricher.

Defaults to:

- `google_public_dns`
- `ipinfo`
- `shodan`
- `whois`

### Emitters

`emitters` is a list of emitters.
See [Emitters](./emitters/index.md) for details of each emitter.

Defaults to:

- `database`
- `misp`
- `slack`
- `the_hive`

### Data Types

`data_types` is a list of data (artifact) types allowed by a rule. Types not listed here are automatically rejected.

Defaults to:

- `ip`
- `domain`
- `url`
- `mail`
- `hash`

### False Positives

`falsepositives` is a list of false positive values. A string or a regexp can be used here.
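
For example, the following sketch disallows one exact value and anything matching a regexp; the values are illustrative, and it assumes a slash-delimited string is treated as a regexp:

```yaml
falsepositives:
  - 1.1.1.1 # exact string match (example value)
  - /\.example\.com$/ # assumed slash-delimited regexp form (example value)
```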

### Artifact TTL

`artifact_ttl` (alias: `artifact_lifetime`) is an integer value of artifact TTL (Time-To-Live) in seconds.

In general, Mihari rejects an artifact that the same rule has already found.

But you may want to see the same artifact again after a certain period of time. `artifact_ttl` is for that. If a rule finds the same artifact again after `artifact_ttl` seconds have passed, that artifact will be included in an alert.
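
For instance, the following sketch (an illustrative value) lets a rule report the same artifact again once a day has passed:

```yaml
artifact_ttl: 86400 # one day, expressed in seconds
```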

## How to Run a Rule

Once you finish writing a rule, you can run it with the `mihari` CLI.

!!! note

    You have to initialize the database with `mihari db migrate` if you haven't already done so.

```bash
mihari search /path/to/rule.yml
```

The command outputs an alert to the standard output. You can also view it in the built-in web app.

```bash
mihari web
```

data/docs/tags.md
ADDED