dorothy2 1.1.0 → 1.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -0,0 +1,15 @@
+ ---
+ !binary "U0hBMQ==":
+ metadata.gz: !binary |-
+ YzVhOTZjZDZjMjNiMThjNGJhZWM5ZGFhZmMyNTViZjk3NTVhNzAyMA==
+ data.tar.gz: !binary |-
+ NWI4ODIzZGU1NTJhZDA5ZDIxYTJjNjE2Mzk4ZmI5MmFlOGNiMGRmOQ==
+ SHA512:
+ metadata.gz: !binary |-
+ NWM0OTBlMDFjNmUwMTg4NjYzYWY3Zjc5NTdiOWRkYTUyOWM0YzY2Yjg0YzY3
+ MDg2YTFiOWZhZTU5YzEwODIzNzIxNzRmODcwMjBlODhhMTg3ZTk5YTJkYzYx
+ NGRmMGY5NjRhNzExYjJkMDg4ZWIyMWQyOWU2MzE2MDcyY2YxOWY=
+ data.tar.gz: !binary |-
+ ZGFmNWJiMDg2NmEwYTFjNzFjMWU0MGMxM2E5NGVlOTdjMWY1ZDJlZjBlYWVm
+ NWQxZmI4M2MzOWM0MzgyYzczNDNmNTgwY2I0ZTM2MGE5MTlhODg5NjA2ODcy
+ ZDgwZjA1NWNiOTc5NmY5NGVlZWIyYjQyNjMwZjEwM2JiMTVlM2U=
@@ -0,0 +1,14 @@
+ Dorothy 1.2.0
+
+ dorothy.yml
+ added sandbox’s network (needed by DEM)
+ added GeoIP.ISP
+
+ fix dparser
+ iconv deprecated
+ added GeoIP.ISP
+ removed lot of unused classes in DEM
+
+ dorothive
+ samples.hash -> sample.sha256
+ traffic_dumps.hash -> traffic_dumps.sha256
data/README.md CHANGED
@@ -50,7 +50,7 @@ very [modular](http://www.honeynet.it/wp-content/uploads/The_big_picture.pdf),an
  Dorothy needs the following software (not expressly in the same host) in order to be executed:

  * VMWare ESX >= 5.0 (tip: if you download ESXi, you can evaluate ESX for 30 days)
- * Ruby 1.8.7
+ * Ruby 1.9.3
  * Postgres >= 9.0
  * At least one Windows virtual machine
  * One unix-like machine dedicated to the Network Analysis Engine(NAM) (tcpdump/ssh needed)
@@ -89,7 +89,7 @@ It is recommended to follow this step2step process:
  * Configure a static IP
  * After configuring everything on the Guest OS, create a snapshot of the sandbox VM from vSphere console. Dorothy will use it when reverting the VM after a binary execution.

- 3. From vSphere, create a unix VM dedicated to the NAM
+ 4. From vSphere, create a unix VM dedicated to the NAM


  * Install tcpdump and sudo
@@ -113,7 +113,7 @@ It is recommended to follow this step2step process:

  * If you want to install pcapr on this machine (if you want to use dorohy from a MacOSX machine, you have to do it) install also these packages (refer to this blog [post](https://github.com/pcapr-local/pcapr-local) for a detailed howto). However, if you are installing Dorothy into a Linux machine, I recommended you to install pcapr on the same machine where the Dorothy gem was installed.

- #apt-get install ruby1.8 rubygems tshark zip couchdb
+ #apt-get install ruby1.9.3 rubygems tshark zip couchdb

  * Start the couchdb server

@@ -139,11 +139,15 @@ It is recommended to follow this step2step process:

  http//{ip-used-by-NAM}:8000

- 4 From vSphere, configure the NIC on the virtual machine that will be used for the network sniffing purpose (NAM).
+ 5 From vSphere, configure the NIC on the virtual machine that will be used for the network sniffing purpose (NAM).
  >The vSwitch where the vNIC resides must allow the promisc mode, to enable it from vSphere:

  >Configuration->Networking->Proprieties on the vistualSwitch used for the analysis->Double click on the virtual network used for the analysis->Securiry->Tick "Promiscuous Mode", then select "Accept" from the list menu.

+ >WARNING:
+ If you are virtualizing ESX from a Linux host machine, remember to give the right privileges to the network interface used by VM Player / Workstation in order [to allow](http://kb.vmware.com/selfservice/microsites/search.do?language=en_US&cmd=displayKC&externalId=287) promiscuous mode:
+
+ > chmod a+rw /dev/vmnet0

  #### * Sample Setups
  1. Basic setup
@@ -175,7 +179,7 @@ or

  3. Install the following packages

- $sudo apt-get install ruby1.8 rubygems postgresql-server-dev-9.1 libxml2-dev libxslt1-dev libmagic-dev
+ $sudo apt-get install ruby1.9.3 rubygems postgresql-server-dev-9.1 libxml2-dev libxslt1-dev libmagic-dev

  >For OSX users: all the above software are available through mac ports. A tip for libmagic: use brew instead:
  >
@@ -277,7 +281,7 @@ Below there are some tips about how understand the root-cause of your crash.

  >Example

- $cd /opt/local/lib/ruby/gems/1.8/gems/dorothy2-0.0.1/test/
+ $cd /opt/local/lib/ruby/gems/1.9.3/gems/dorothy2-0.0.1/test/
  $ruby tc_dorothy_full.rb

  2. Set the verbose flag (-v) while executing dorothy
data/TODO CHANGED
@@ -16,7 +16,7 @@
  -ListFileInGuest -> Create Files/Folder Baseline.

  -MANAGE SIG-INT WHILE MULTITHREAD
- -INTERACTIVE CONSOLE 10%
+ -INTERACTIVE CONSOLE 90%
  -ADD VNC CLIENT SPAWN IN MANUAL MODE

  -REVIEW DOROTHIVE (binary fullpath?)
data/UPDATE CHANGED
@@ -1,21 +1,19 @@
  #######################################
- #Updating from Dorothy 1.0.x to >= 1.0.9##
+ #Updating from Dorothy 1.0.x to >= 1.2.0##
  #######################################

- Dorothy 1.0.9 introduces several features that improve the overall framework.
+ Dorothy 1.2.0 introduces several features that improve the overall framework.
  Below, the recommended steps needed to update your Dorothy environment.

  a) Remove the Dorothy configuration file
  rm ~/.dorothy.yml
  And recreate it by restarting Dorothy. You will see that the init script will ask you more question than before.

- b) Since a new configuration file has been added in your Dorothy's etc/ folder (extension.yml), go and edit it
- accordingly to your environment.
+ b) The last version of Dorothy modified the dorothive schema in order to let dorothive compatible with Sinatra and Rails.
+ The columns modified are the following:
+ samples.hash -> sample.sha256
+ traffic_dumps.hash -> traffic_dumps.sha256
+ You can modify them manually if you have already a previous Dorothy version up and running, or drop the database and recreate it (-D) using the updated .ddl .

- c) From Dorothy home, execute the following SQL script in order to update the database schema. It will add the new table sys_procs.
-
- sudo -u postgres psql dorothive -f share/update_dorothive.sql
-
- That's all! You are ready to go!

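The manual path in point b) amounts to two column renames. A minimal sketch of such a one-off migration, assuming the pg gem and a locally reachable dorothive database (connection parameters are illustrative, not taken from the package):

    # Hypothetical one-off migration, not shipped with the gem: renames the
    # legacy hash columns to sha256 as listed in the UPDATE notes above.
    require 'pg'

    conn = PG.connect(dbname: 'dorothive', user: 'postgres')
    conn.exec('ALTER TABLE dorothy.samples RENAME COLUMN hash TO sha256')
    conn.exec('ALTER TABLE dorothy.traffic_dumps RENAME COLUMN hash TO sha256')
    conn.close

The alternative described above is to drop the database and recreate it with -D, so the updated .ddl builds the new schema directly.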
@@ -216,9 +216,9 @@ begin
  rescue SignalException
  Dorothy.stop_running_analyses
  rescue => e
- puts "[" + "+".red + "] " + "[Dorothy]".yellow + " An error occurred: ".red + $!
- puts "[" + "+".red + "] " + "[Dorothy]".yellow + " For more information check the logfile" + $! if daemon
- LOGGER.error "Dorothy", "An error occurred: " + $!
+ puts "[" + "+".red + "] " + "[Dorothy]".yellow + " An error occurred: \n".red + e.inspect
+ puts "[" + "+".red + "] " + "[Dorothy]".yellow + " For more information check the logfile \n" + e.inspect if daemon
+ LOGGER.error "Dorothy", "An error occurred: \n" + e.inspect
  LOGGER.debug "Dorothy", "#{e.inspect} --BACKTRACE: #{e.backtrace}"
  LOGGER.info "Dorothy", "Dorothy has been stopped"
  end
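The switch from $! to e.inspect in these rescue blocks (also applied in the parser below) avoids a TypeError: $! holds the Exception object itself, and String#+ only accepts Strings. A standalone sketch of the behaviour (plain Ruby, not Dorothy code):

    # String#+ does not coerce an Exception, so the old concatenation raised.
    begin
      raise ArgumentError, "boom"
    rescue => e
      # "An error occurred: " + $!            # TypeError: no implicit conversion
      puts "An error occurred: " + e.inspect  # => An error occurred: #<ArgumentError: boom>
      puts "An error occurred: #{e}"          # interpolation calls to_s and also works
    end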
@@ -9,6 +9,7 @@ require 'trollop'
  require 'dorothy2'
  require 'doroParser'

+ #load '../lib/dorothy2.rb'
  #load '../lib/doroParser.rb'

  include Dorothy
@@ -63,28 +64,30 @@ LOGGER_PARSER.sev_threshold = DoroSettings.env[:loglevel]
  LOGGER = DoroLogger.new(logout, DoroSettings.env[:logage])
  LOGGER.sev_threshold = DoroSettings.env[:loglevel]

- if system "sh -c 'type startpcapr > /dev/null 2>&1'"
- pcapr_conf = "#{File.expand_path("~")}/.pcapr_local/config"
- unless Util.exists?(pcapr_conf)
- puts "[WARNING]".red + " Pcapr conf not found at #{File.expand_path("~")}/.pcapr_local/config "
- puts "[WARNING]".red + " Although you have configured Dorothy in order to look for a *local* Pcapr instance,it seems that it is not configured yet,so please run \"startpcapr\" and configure it."
+ if DoroSettings.pcapr[:local]=="true"
+ if system "sh -c 'type startpcapr > /dev/null 2>&1'"
+ pcapr_conf = "#{File.expand_path("~")}/.pcapr_local/config"
+ unless Util.exists?(pcapr_conf)
+ puts "[WARNING]".red + " Pcapr conf not found at #{File.expand_path("~")}/.pcapr_local/config "
+ puts "[WARNING]".red + " Although you have configured Dorothy in order to look for a *local* Pcapr instance,it seems that it is not configured yet,so please run \"startpcapr\" and configure it."
+ exit(1)
+ end
+ else
+ puts "[WARNING]".red + " Although you have configured Dorothy in order to look for a *local* Pcapr instance, it seems *NOT INSTALLED* in your system.\n\t Please install it by typing \"sudo gem install pcapr-local\. Then set Pcapr to scan #{DoroSettings.env[:analysis_dir]}"
  exit(1)
  end
- else
- puts "[WARNING]".red + "Although you have configured Dorothy in order to look for a *local* Pcapr instance, it seems *NOT INSTALLED* in your system.\n\t Please install it by typing \"sudo gem install pcapr-local\. Then set Pcapr to scan #{DoroSettings.env[:analysis_dir]}"
- exit(1)
  end


  begin
  DoroParser.start(daemon)
  rescue => e
- puts "[PARSER]".yellow + " An error occurred: ".red + $!
+ puts "[PARSER]".yellow + " An error occurred: ".red + e.inspect
  if daemon
- puts "[PARSER]".yellow + " For more information check the logfile" + $!
+ puts "[PARSER]".yellow + " For more information check the logfile" + e.inspect
  puts "[PARSER]".yellow + "Dorothy-Parser has been stopped"
  end
- LOGGER_PARSER.error "Parser", "An error occurred: " + $!
+ LOGGER_PARSER.error "Parser", "An error occurred: " + e.inspect
  LOGGER_PARSER.debug "Parser", "#{e.inspect} --BACKTRACE: #{e.backtrace}"
  LOGGER_PARSER.info "Parser", "Dorothy-Parser has been stopped"
  end
@@ -11,6 +11,9 @@ require 'dorothy2'
  require 'doroParser'

  #load '../lib/doroParser.rb'
+ #load '../lib/dorothy2.rb'
+
+

  include Dorothy
  include DoroParser
@@ -16,6 +16,7 @@ Gem::Specification.new do |gem|
  gem.test_files = gem.files.grep(%r{^(test|spec|features)/})
  gem.extra_rdoc_files = ["README.md"]
  gem.require_paths = ["lib"]
+ gem.required_ruby_version = '>= 1.9.3'
  gem.add_dependency(%q<net-scp>, [">= 1.0.4"])
  gem.add_dependency(%q<net-ssh>, [">= 2.2.1"])
  gem.add_dependency(%q<trollop>, [">= 1.16.2"])
@@ -31,6 +32,6 @@ Gem::Specification.new do |gem|
  gem.add_dependency(%q<net-dns>, [">= 0.8.0"])
  gem.add_dependency(%q<geoip>, [">= 1.2.1"])
  gem.add_dependency(%q<tmail>, [">= 1.2.7.1"])
- gem.post_install_message = 'If you are upgrating from a previous version, read the UPDATE file!'
+ gem.post_install_message = '\n WARING: If you are upgrating from a previous version, read the UPDATE file!\n'
  end

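Ruby only interprets escape sequences such as \n inside double-quoted strings, so the new single-quoted post-install message keeps the literal backslash-n characters rather than printing blank lines. A standalone illustration of the language rule (not Dorothy code):

    # Escape handling in Ruby string literals.
    puts '\n single-quoted \n'   # prints the two-character sequences \n verbatim
    puts "\n double-quoted \n"   # prints the text surrounded by real newlines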
@@ -223,7 +223,7 @@ SELECT pg_catalog.setval('analyses_id_seq', 1, true);
  --

  CREATE TABLE samples (
- hash character(64) NOT NULL,
+ sha256 character(64) NOT NULL,
  size integer NOT NULL,
  path character(256),
  filename character(256),
@@ -246,7 +246,7 @@ COMMENT ON TABLE samples IS 'Acquired samples';
  -- Name: COLUMN samples.hash; Type: COMMENT; Schema: dorothy; Owner: postgres
  --

- COMMENT ON COLUMN samples.hash IS 'SHA256 checksum hash';
+ COMMENT ON COLUMN samples.sha256 IS 'SHA256 checksum hash';


  --
@@ -267,7 +267,7 @@ COMMENT ON CONSTRAINT size_notneg ON samples IS 'Sample size must not be negativ
  --

  CREATE TABLE traffic_dumps (
- hash character(64) NOT NULL,
+ sha256 character(64) NOT NULL,
  size integer NOT NULL,
  pcapr_id character(32),
  "binary" character varying,
@@ -281,7 +281,7 @@ ALTER TABLE dorothy.traffic_dumps OWNER TO postgres;
  -- Name: COLUMN traffic_dumps.hash; Type: COMMENT; Schema: dorothy; Owner: postgres
  --

- COMMENT ON COLUMN traffic_dumps.hash IS 'SHA256 checksum hash';
+ COMMENT ON COLUMN traffic_dumps.sha256 IS 'SHA256 checksum hash';


  --
@@ -289,7 +289,7 @@ COMMENT ON COLUMN traffic_dumps.hash IS 'SHA256 checksum hash';
  --

  CREATE VIEW analysis_resume_view AS
- SELECT analyses.id, samples.filename, samples.md5, samples.long_type, analyses.date, traffic_dumps.parsed FROM traffic_dumps, samples, analyses WHERE ((analyses.sample = samples.hash) AND (analyses.traffic_dump = traffic_dumps.hash)) ORDER BY analyses.id DESC;
+ SELECT analyses.id, samples.filename, samples.md5, samples.long_type, analyses.date, traffic_dumps.parsed FROM traffic_dumps, samples, analyses WHERE ((analyses.sample = samples.sha256) AND (analyses.traffic_dump = traffic_dumps.sha256)) ORDER BY analyses.id DESC;


  ALTER TABLE dorothy.analysis_resume_view OWNER TO postgres;
@@ -545,7 +545,7 @@ ALTER TABLE dorothy.roles OWNER TO postgres;
  --

  CREATE VIEW ccprofile_view3 AS
- SELECT DISTINCT host_ips.id AS hostid, host_ips.ip, flows.dstport, traffic_dumps.hash, irc_data.id, roles.type, dns_data.name, irc_data.data FROM roles, host_roles, host_ips, dns_data, flows, irc_data, traffic_dumps WHERE (((((((((roles.id = host_roles.role) AND (host_roles.host_ip = host_ips.ip)) AND (dns_data.id = host_ips.dns_name)) AND (flows.dest = host_ips.ip)) AND (flows.traffic_dump = traffic_dumps.hash)) AND (irc_data.flow = flows.id)) AND (irc_data.incoming = false)) AND (host_ips.is_online = true)) AND ((roles.type)::text = 'cc-irc'::text)) ORDER BY irc_data.id, host_ips.id, host_ips.ip, flows.dstport, traffic_dumps.hash, roles.type, dns_data.name, irc_data.data;
+ SELECT DISTINCT host_ips.id AS hostid, host_ips.ip, flows.dstport, traffic_dumps.sha256, irc_data.id, roles.type, dns_data.name, irc_data.data FROM roles, host_roles, host_ips, dns_data, flows, irc_data, traffic_dumps WHERE (((((((((roles.id = host_roles.role) AND (host_roles.host_ip = host_ips.ip)) AND (dns_data.id = host_ips.dns_name)) AND (flows.dest = host_ips.ip)) AND (flows.traffic_dump = traffic_dumps.sha256)) AND (irc_data.flow = flows.id)) AND (irc_data.incoming = false)) AND (host_ips.is_online = true)) AND ((roles.type)::text = 'cc-irc'::text)) ORDER BY irc_data.id, host_ips.id, host_ips.ip, flows.dstport, traffic_dumps.sha256, roles.type, dns_data.name, irc_data.data;


  ALTER TABLE dorothy.ccprofile_view3 OWNER TO postgres;
@@ -1079,21 +1079,16 @@ SELECT pg_catalog.setval('whois_id_seq', 1, false);
  -- Name: sys_procs; Type: TABLE; Schema: dorothy; Owner: postgres; Tablespace:
  --

- CREATE TABLE dorothy.sys_procs
- (
- analysis_id integer NOT NULL,
- pid integer NOT NULL,
- name character varying,
- owner character varying,
- "cmdLine" character varying,
- "startTime" timestamp without time zone,
- "endTime" timestamp without time zone,
- "exitCode" integer,
- CONSTRAINT "procs-pk" PRIMARY KEY (analysis_id , pid ),
- CONSTRAINT "anal_id-fk" FOREIGN KEY (analysis_id)
- REFERENCES dorothy.analyses (id) MATCH SIMPLE
- ON UPDATE NO ACTION ON DELETE NO ACTION
- )
+ CREATE TABLE sys_procs (
+ analysis_id integer NOT NULL,
+ pid integer NOT NULL,
+ name character varying,
+ owner character varying,
+ "cmdLine" character varying,
+ "startTime" timestamp without time zone,
+ "endTime" timestamp without time zone,
+ "exitCode" integer
+ );


  ALTER TABLE dorothy.sys_procs OWNER TO postgres;
@@ -1320,7 +1315,7 @@ COPY roles (id, type, comment) FROM stdin;
  -- Data for Name: samples; Type: TABLE DATA; Schema: dorothy; Owner: postgres
  --

- COPY samples (hash, size, path, filename, md5, long_type) FROM stdin;
+ COPY samples (sha256, size, path, filename, md5, long_type) FROM stdin;
  \.


@@ -1355,7 +1350,7 @@ COPY sightings (sample, sensor, date, traffic_dump) FROM stdin;
  -- Data for Name: traffic_dumps; Type: TABLE DATA; Schema: dorothy; Owner: postgres
  --

- COPY traffic_dumps (hash, size, pcapr_id, "binary", parsed) FROM stdin;
+ COPY traffic_dumps (sha256, size, pcapr_id, "binary", parsed) FROM stdin;
  EMPTYPCAP 0 ffff ffff true
  \.

@@ -1420,7 +1415,7 @@ ALTER TABLE ONLY geoinfo
  --

  ALTER TABLE ONLY samples
- ADD CONSTRAINT hash PRIMARY KEY (hash);
+ ADD CONSTRAINT sha256 PRIMARY KEY (sha256);


  --
@@ -1486,6 +1481,12 @@ ALTER TABLE ONLY host_ips
  ALTER TABLE ONLY irc_data
  ADD CONSTRAINT pk_irc PRIMARY KEY (id);

+ --
+ -- Name: procs-pk; Type: CONSTRAINT; Schema: dorothy; Owner: postgres; Tablespace:
+ --
+
+ ALTER TABLE ONLY sys_procs
+ ADD CONSTRAINT "procs-pk" PRIMARY KEY (analysis_id, pid);

  --
  -- Name: reports_pkey; Type: CONSTRAINT; Schema: dorothy; Owner: postgres; Tablespace:
@@ -1523,7 +1524,7 @@ ALTER TABLE ONLY sensors
  --

  ALTER TABLE ONLY traffic_dumps
- ADD CONSTRAINT traffic_dumps_pkey PRIMARY KEY (hash);
+ ADD CONSTRAINT traffic_dumps_pkey PRIMARY KEY (sha256);


  --
@@ -1639,6 +1640,12 @@ CREATE INDEX fki_shash ON reports USING btree (sample);

  CREATE INDEX fki_tdumps ON analyses USING btree (traffic_dump);

+ --
+ -- Name: anal_id-fk; Type: FK CONSTRAINT; Schema: dorothy; Owner: postgres
+ --
+
+ ALTER TABLE ONLY sys_procs
+ ADD CONSTRAINT "anal_id-fk" FOREIGN KEY (analysis_id) REFERENCES analyses(id);

  --
  -- Name: dest_ip; Type: FK CONSTRAINT; Schema: dorothy; Owner: postgres
@@ -1661,7 +1668,7 @@ ALTER TABLE ONLY host_ips
  --

  ALTER TABLE ONLY flows
- ADD CONSTRAINT dumps FOREIGN KEY (traffic_dump) REFERENCES traffic_dumps(hash);
+ ADD CONSTRAINT dumps FOREIGN KEY (traffic_dump) REFERENCES traffic_dumps(sha256);


  --
@@ -1669,7 +1676,7 @@ ALTER TABLE ONLY flows
  --

  ALTER TABLE ONLY malwares
- ADD CONSTRAINT fk_bin FOREIGN KEY (bin) REFERENCES samples(hash);
+ ADD CONSTRAINT fk_bin FOREIGN KEY (bin) REFERENCES samples(sha256);


  --
@@ -1741,7 +1748,7 @@ ALTER TABLE ONLY host_roles
  --

  ALTER TABLE ONLY analyses
- ADD CONSTRAINT samples FOREIGN KEY (sample) REFERENCES samples(hash);
+ ADD CONSTRAINT samples FOREIGN KEY (sample) REFERENCES samples(sha256);


  --
@@ -1749,7 +1756,7 @@ ALTER TABLE ONLY sightings
  --

  ALTER TABLE ONLY sightings
- ADD CONSTRAINT samples FOREIGN KEY (sample) REFERENCES samples(hash);
+ ADD CONSTRAINT samples FOREIGN KEY (sample) REFERENCES samples(sha256);


  --
@@ -1765,7 +1772,7 @@ ALTER TABLE ONLY sightings
  --

  ALTER TABLE ONLY reports
- ADD CONSTRAINT shash FOREIGN KEY (sample) REFERENCES samples(hash);
+ ADD CONSTRAINT shash FOREIGN KEY (sample) REFERENCES samples(sha256);


  --
@@ -1773,7 +1780,7 @@ ALTER TABLE ONLY reports
  --

  ALTER TABLE ONLY analyses
- ADD CONSTRAINT tdumps FOREIGN KEY (traffic_dump) REFERENCES traffic_dumps(hash);
+ ADD CONSTRAINT tdumps FOREIGN KEY (traffic_dump) REFERENCES traffic_dumps(sha256);


  --
@@ -1797,4 +1804,3 @@ GRANT ALL ON SCHEMA dorothy TO PUBLIC;
  --
  -- PostgreSQL database dump complete
  --
-
@@ -12,6 +12,13 @@ exe:
  prog_path: C:\windows\system32\cmd.exe
  prog_args: /C

+
+ bat:
+ prog_name: Windows CMD.exe
+ prog_path: C:\windows\system32\cmd.exe
+ prog_args: /C
+
+
  dll:
  prog_name: Windows Rundll32.exe
  prog_path: C:\windows\system32\rundll32.exe
@@ -2,10 +2,8 @@
  # This file is part of Dorothy - http://www.honeynet.it/
  # See the file 'LICENSE' for copying permission.

- #!/usr/local/bin/ruby

  #load 'lib/doroParser.rb'; include Dorothy; include DoroParser; LOGGER = DoroLogger.new(STDOUT, "weekly")
-
  #Install mu/xtractr from svn checkout http://pcapr.googlecode.com/svn/trunk/ pcapr-read-only


@@ -15,26 +13,21 @@
  ## Data Definition Module ##
  ############################

-
- require 'rubygems'
- require 'md5'
+ require 'digest'
  require 'rbvmomi'
  require 'rest_client'
  require 'net/dns'
  require 'net/dns/packet'
  require 'ipaddr'
  require 'colored'
- require 'ftools'
- require 'filemagic' #require 'pcaplet'
+ require 'filemagic'
  require 'geoip'
  require 'pg'
- require 'iconv'
  require 'tmail'
  require 'ipaddr'
  require 'net/http'
  require 'json'

- require File.dirname(__FILE__) + '/dorothy2/environment'
  require File.dirname(__FILE__) + '/mu/xtractr'
  require File.dirname(__FILE__) + '/dorothy2/DEM'
  require File.dirname(__FILE__) + '/dorothy2/do-utils'
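The dropped requires ('md5', 'ftools', 'iconv', 'rubygems') are libraries that were removed or deprecated with the move to Ruby 1.9.x; the stdlib 'digest' module now covers the hashing side. A minimal sketch of the replacement API (standalone, not the gem's own code; the file name is illustrative):

    # Digest::MD5 and Digest::SHA256 ship with the 'digest' standard library.
    require 'digest'

    data = File.binread('sample.bin')    # hypothetical binary sample
    puts Digest::MD5.hexdigest(data)     # 32 hex chars, as stored in samples.md5
    puts Digest::SHA256.hexdigest(data)  # 64 hex chars, matching the new sha256 columns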
@@ -72,13 +65,13 @@ module DoroParser
  pcaps.each do |dump|
  #RETRIVE MALWARE FILE INFO

- !dump['sample'].nil? && !dump['hash'].nil? && !dump['pcapr_id'].nil? or next
+ !dump['sample'].nil? && !dump['sha256'].nil? && !dump['pcapr_id'].nil? or next

  LOGGER_PARSER.info "PARSER", "Analyzing file: ".yellow + dump['sample']
  LOGGER_PARSER.info "PARSER", "Analyzing pcaprid: ".yellow + dump['pcapr_id'].gsub(/\s+/, "")


- LOGGER_PARSER.debug "PARSER", "Analyzing dump: ".yellow + dump['hash'].gsub(/\s+/, "") if VERBOSE
+ LOGGER_PARSER.debug "PARSER", "Analyzing dump: ".yellow + dump['sha256'].gsub(/\s+/, "") if VERBOSE

  downloadir = "#{DoroSettings.env[:analysis_dir]}/#{dump['anal_id']}/downloads"

@@ -90,7 +83,7 @@ module DoroParser

  rescue => e
  LOGGER_PARSER.fatal "PARSER", "Can't connect to the PCAPR server."
- LOGGER_PARSER.debug "PARSER", "#{$!}"
+ LOGGER_PARSER.debug "PARSER", "#{e.inspect}"
  LOGGER_PARSER.debug "PARSER", e.backtrace if VERBOSE
  return false
  end
@@ -184,7 +177,7 @@ module DoroParser

  #case TCP xtractr.flows('flow.service:SMTP').first.proto = 6

- flowvals = [flow.src.address, flow.dst.address, flow.sport, flow.dport, flow.bytes, dump['hash'], flow.packets, "default", flow.proto, flow.service.name, title, "null", flow.duration, flow.time, flow.id ]
+ flowvals = [flow.src.address, flow.dst.address, flow.sport, flow.dport, flow.bytes, dump['sha256'], flow.packets, "default", flow.proto, flow.service.name, title, "null", flow.duration, flow.time, flow.id ]

  if !@insertdb.insert("flows",flowvals)
  LOGGER_PARSER.info "PARSER", "Skipping flow #{flow.id}: #{flow.src.address} > #{flow.dst.address}"
@@ -390,7 +383,7 @@ module DoroParser
  rescue => e

  LOGGER_PARSER.error "DB", "Something went wrong while adding a DNS entry into the DB (packet malformed?) - The packet will be skipped"
- LOGGER_PARSER.debug "DB", "#{$!}" if VERBOSE
+ LOGGER_PARSER.debug "DB", "#{e.inspect}" if VERBOSE
  LOGGER_PARSER.debug "DB", e if VERBOSE
  end

@@ -420,7 +413,7 @@ module DoroParser
  #DEBUG
  #puts "save?"
  #gets
- @insertdb.set_analyzed(dump['hash'])
+ @insertdb.set_analyzed(dump['sha256'])
  @insertdb.commit
  end
  end