whodunit-chronicles 0.1.0.pre β†’ 0.1.0

checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: 527e3470b4156a28e1972948cc79cbe4ce2ee72ec7582692434e98611c5e0071
- data.tar.gz: 51302c24757877422d8ec97f894aa4af06561d9d53fa60ac65f6bc94d4b89705
+ metadata.gz: ddd2ff0c636f7842e53659f39138f326f14eb69227688c0fc5170e8db4868f10
+ data.tar.gz: f8d7514ce651c0bb7431740897940083488c0ed24c2e896d54d509677b2cbbd1
  SHA512:
- metadata.gz: 5ce050e09a6f0407c2d5c64b2196910f84834fc17408a3477fd082fbab04a346fb69dbc43d28fbd1d8868f25915206c6af6b9ac1ffe52700d7c0b2e9e46916b9
- data.tar.gz: cea9a584c9fb99306a6cbcbdd8afc87f23f14db4ec6e90f96c8364006defb869ce3abd69fb72a02a4e4a79ce9144494e96c47dacca27e9ef669f4bb0ff44558a
+ metadata.gz: 7654d23d4e932046c0408c513a18e68f0e4c46856b0aed507f48eea3ed9744838b49ce0b19a4ff0b1853ff156cea64f8ffefa55af5b1e788bce597d8c662be11
+ data.tar.gz: d5a62ae1a2893b65593aa93216377c0d3f6fbbdc28b90362acc9cf8ceeec159e296335014efb3fa77e645a4cc9d5cf8392e81991613fcb27a8ffc9d953f57f64
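For readers who want to confirm they received exactly these artifacts, checksums like the ones above can be re-computed locally. A minimal Ruby sketch (the file here is an in-memory stand-in; substitute the `.gem` contents you actually downloaded, and a real checksum from the list above):

```ruby
# Recompute a SHA256 digest and compare it to a published checksum.
# The temp file below stands in for a real downloaded archive.
require 'digest'
require 'tempfile'

def checksum_matches?(path, expected_hex)
  Digest::SHA256.file(path).hexdigest == expected_hex
end

Tempfile.create('data.tar.gz') do |f|
  f.write('example bytes')
  f.flush
  expected = Digest::SHA256.hexdigest('example bytes')
  puts checksum_matches?(f.path, expected) # prints "true"
end
```

The same pattern works for the SHA512 entries by swapping in `Digest::SHA512`.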
data/.rubocop.yml CHANGED
@@ -7,6 +7,7 @@ AllCops:
  - 'tmp/**/*'
  - 'bin/**/*'
  - 'node_modules/**/*'
+ - 'examples/**/*'
 
  # Use plugins for extensions
  plugins:
@@ -89,4 +90,4 @@ Performance/Casecmp:
  Enabled: true
 
  Performance/StringReplacement:
- Enabled: true
+ Enabled: true
data/CHANGELOG.md CHANGED
@@ -5,8 +5,6 @@ All notable changes to this project will be documented in this file.
  The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
  and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
- ## [Unreleased]
-
  ### Added
 
  - Comprehensive GitHub Actions CI/CD pipeline with multi-Ruby testing
data/README.md CHANGED
@@ -8,6 +8,8 @@
 
  > **The complete historical record of your _Whodunit Dun Wat?_ data**
 
+ > **💡 Origin Story:** Chronicles was inspired by the challenge of streaming database changes for real-time analytics without impacting application performance. The approach proved so effective in an earlier project that it became the foundation for this Ruby implementation.
+
  While [Whodunit](https://github.com/kanutocd/whodunit) tracks _who_ made changes, **Chronicles** captures _what_ changed by streaming database events into comprehensive audit trails with **zero Rails application overhead**.
 
  ## ✨ Features
@@ -18,10 +20,155 @@ While [Whodunit](https://github.com/kanutocd/whodunit) tracks _who_ made changes
  - **⚡ Thread-Safe**: Concurrent processing with configurable thread pools
  - **🛡️ Resilient**: Built-in error handling, retry logic, and monitoring
  - **📊 Complete Audit Trail**: Captures INSERT, UPDATE, DELETE with full before/after data
- - **🧪 VERY Soon to be Production Ready**: 94%+ test coverage with comprehensive error scenarios
+ - **🧪 Code Coverage**: 94%+ test coverage with comprehensive error scenarios
 
  ## 🚀 Quick Start
 
+ ### 🎯 Usage Scenarios
+
+ Chronicles excels at transforming database changes into business intelligence. Here are two common patterns:
+
+ #### 1. Basic Audit Trail Integration
+
+ Perfect for applications that need comprehensive change tracking alongside Whodunit's user attribution:
+
+ ```ruby
+ # Basic setup for user activity tracking
+ class BasicAuditProcessor < Whodunit::Chronicles::AuditProcessor
+   def build_chronicles_record(change_event)
+     super.tap do |record|
+       # Add basic business context
+       record[:change_category] = categorize_change(change_event)
+       record[:business_impact] = assess_impact(change_event)
+     end
+   end
+
+   private
+
+   def categorize_change(change_event)
+     case change_event.table_name
+     when 'users' then 'user_management'
+     when 'posts' then 'content'
+     when 'comments' then 'engagement'
+     else 'system'
+     end
+   end
+ end
+ ```
+
+ **Use Case**: A blog platform tracking user posts and comments for community management and content moderation.
+
+ #### 2. Advanced Recruitment Analytics
+
+ Sophisticated business intelligence for talent acquisition platforms:
+
+ ```ruby
+ # Advanced processor for recruitment metrics
+ class RecruitmentAnalyticsProcessor < Whodunit::Chronicles::AuditProcessor
+   def build_chronicles_record(change_event)
+     super.tap do |record|
+       # Add recruitment-specific business metrics
+       record[:recruitment_stage] = determine_stage(change_event)
+       record[:funnel_position] = calculate_funnel_position(change_event)
+       record[:time_to_hire_impact] = assess_time_impact(change_event)
+       record[:cost_per_hire_impact] = calculate_cost_impact(change_event)
+
+       # Campaign attribution
+       record[:utm_source] = extract_utm_source(change_event)
+       record[:campaign_id] = extract_campaign_id(change_event)
+
+       # Quality metrics
+       record[:candidate_quality_score] = assess_candidate_quality(change_event)
+     end
+   end
+
+   def process(change_event)
+     record = build_chronicles_record(change_event)
+     store_audit_record(record)
+
+     # Stream to analytics platforms
+     stream_to_prometheus(record) if track_metrics?
+     update_grafana_dashboard(record)
+     trigger_real_time_alerts(record) if alert_worthy?(record)
+   end
+
+   private
+
+   def determine_stage(change_event)
+     return 'unknown' unless change_event.table_name == 'applications'
+
+     case change_event.new_data&.dig('status')
+     when 'submitted' then 'application'
+     when 'screening', 'in_review' then 'screening'
+     when 'interview_scheduled', 'interviewed' then 'interview'
+     when 'offer_extended', 'offer_accepted' then 'offer'
+     when 'hired' then 'hire'
+     else 'unknown'
+     end
+   end
+
+   def stream_to_prometheus(record)
+     # Track key recruitment metrics
+     RECRUITMENT_APPLICATIONS_TOTAL.increment(
+       source: record[:utm_source],
+       department: record.dig(:new_data, 'department')
+     )
+
+     if record[:action] == 'UPDATE' && status_changed_to_hired?(record)
+       RECRUITMENT_HIRES_TOTAL.increment(
+         source: record[:utm_source],
+         time_to_hire: record[:time_to_hire_impact]
+       )
+     end
+   end
+
+   def update_grafana_dashboard(record)
+     # Send time-series data for Grafana visualization
+     InfluxDB.write_point('recruitment_events', {
+       timestamp: record[:occurred_at],
+       table: record[:table_name],
+       action: record[:action],
+       stage: record[:recruitment_stage],
+       source: record[:utm_source],
+       cost_impact: record[:cost_per_hire_impact],
+       quality_score: record[:candidate_quality_score]
+     })
+   end
+ end
+ ```
+
+ **Use Case**: Imagine a spherical-cow talent acquisition platform tracking the candidate journey from application through hire, with real-time dashboards showing conversion rates, time-to-hire, cost-per-hire, and source effectiveness.
+
+ #### 📊 Visual Analytics Dashboard
+
+ The recruitment analytics processor creates comprehensive Grafana dashboards for executive reporting and operational insights:
+
+ <div align="center">
+
+ **Campaign Performance Analytics**
+ <a href="examples/images/campaign-performance-analytics.png" title="Click to view full size image">
+   <img src="examples/images/campaign-performance-analytics.png" width="300" />
+ </a>
+ *Track campaign ROI, cost-per-hire by channel, and conversion rates across marketing sources*
+
+ **Candidate Journey Analytics**
+ <a href="examples/images/candidate-journey-analytics.png" title="Click to view full size image">
+   <img src="examples/images/candidate-journey-analytics.png" width="300" />
+ </a>
+ *Monitor candidate engagement, funnel conversion rates, and application completion patterns*
+
+ **Recruitment Funnel Analytics**
+ <a href="examples/images/recruitment-funnel-analytics.png" title="Click to view full size image">
+   <img src="examples/images/recruitment-funnel-analytics.png" width="300" />
+ </a>
+ *Analyze hiring pipeline progression, department performance, and time-series trends*
+
+ </div>
+
+ These dashboards are populated automatically by Chronicles as candidates move through your hiring funnel, providing real-time visibility into recruitment performance without any manual data entry.
 
  ### Installation
 
  Add to your Gemfile:
@@ -150,30 +297,162 @@ Chronicles creates structured audit records for each database change:
 
  ## 🔧 Advanced Usage
 
- ### Custom Audit Processing
+ ### Custom Processors for Analytics & Monitoring
+
+ **The real power of Chronicles** comes from creating custom processors tailored to your specific analytics needs. While Whodunit captures the basic "who changed what," Chronicles lets you build sophisticated data pipelines for tools like **Grafana**, **DataDog**, or **Elasticsearch**.
+
+ Transform database changes into actionable business intelligence with features like:
+
+ - **25+ Custom Metrics**: Track business KPIs like conversion rates, time-to-hire, and cost-per-acquisition
+ - **Real-time Dashboards**: Stream data to Grafana for executive reporting and operational monitoring
+ - **Smart Alerting**: Trigger notifications based on business rules and thresholds
+ - **Multi-destination Streaming**: Send data simultaneously to multiple analytics platforms
+
+ #### Analytics-Focused Processor
 
  ```ruby
- class MyCustomProcessor < Whodunit::Chronicles::AuditProcessor
+ class AnalyticsProcessor < Whodunit::Chronicles::AuditProcessor
    def build_chronicles_record(change_event)
      super.tap do |record|
-       record[:custom_field] = extract_custom_data(change_event)
-       record[:environment] = Rails.env
+       # Add business metrics
+       record[:business_impact] = calculate_business_impact(change_event)
+       record[:user_segment] = determine_user_segment(change_event)
+       record[:feature_flag] = current_feature_flags
+
+       # Add performance metrics
+       record[:change_size] = calculate_change_size(change_event)
+       record[:peak_hours] = during_peak_hours?
+       record[:geographic_region] = user_region(change_event)
+
+       # Add time-series friendly fields for Grafana
+       record[:hour_of_day] = Time.current.hour
+       record[:day_of_week] = Time.current.wday
+       record[:is_weekend] = weekend?
+
+       # Custom tagging for filtering
+       record[:tags] = generate_tags(change_event)
+     end
+   end
+
+   private
+
+   def calculate_business_impact(change_event)
+     case change_event.table_name
+     when 'orders' then 'revenue_critical'
+     when 'users' then 'customer_critical'
+     when 'products' then 'inventory_critical'
+     else 'standard'
      end
    end
 
+   def determine_user_segment(change_event)
+     return 'anonymous' unless change_event.user_id
+
+     # Look up the user tier from your business logic
+     # (find_by returns nil instead of raising when the user is gone)
+     User.find_by(id: change_event.user_id)&.tier || 'standard'
+   end
+
+   def generate_tags(change_event)
+     tags = [change_event.action.downcase]
+     tags << 'bulk_operation' if bulk_operation?(change_event)
+     tags << 'api_driven' if api_request?
+     tags << 'admin_action' if admin_user?(change_event.user_id)
+     tags
+   end
+ end
+ ```
+
+ #### Grafana Dashboard Ready
+
+ ```ruby
+ class GrafanaProcessor < Whodunit::Chronicles::AuditProcessor
+   def build_chronicles_record(change_event)
+     {
+       # Core metrics for Grafana time series
+       timestamp: change_event.occurred_at,
+       table_name: change_event.table_name,
+       action: change_event.action,
+
+       # Numerical metrics for graphs
+       records_affected: calculate_records_affected(change_event),
+       change_magnitude: calculate_change_magnitude(change_event),
+       user_session_duration: calculate_session_duration(change_event),
+
+       # Categorical dimensions for filtering
+       environment: Rails.env,
+       application_version: app_version,
+       database_instance: database_identifier,
+
+       # Business KPIs
+       revenue_impact: calculate_revenue_impact(change_event),
+       customer_satisfaction_risk: assess_satisfaction_risk(change_event),
+
+       # Performance indicators
+       query_duration_ms: extract_query_duration(change_event),
+       concurrent_users: current_concurrent_users,
+       system_load: current_system_load
+     }
+   end
+ end
+ ```
+
+ #### Real-Time Alerts Processor
+
+ ```ruby
+ class AlertingProcessor < Whodunit::Chronicles::AuditProcessor
+   def process(change_event)
+     record = build_chronicles_record(change_event)
+
+     # Store the audit record
+     store_audit_record(record)
+
+     # Real-time alerting logic
+     send_alert(record) if alert_worthy?(record)
+
+     # Stream to monitoring systems
+     stream_to_datadog(record) if production?
+     stream_to_grafana(record)
+   end
+
    private
 
-   def extract_custom_data(change_event)
-     # Your custom logic here
+   def alert_worthy?(record)
+     # Define your alerting criteria
+     record[:business_impact] == 'revenue_critical' ||
+       record[:records_affected] > 1000 ||
+       (record[:action] == 'DELETE' && record[:table_name] == 'orders')
+   end
+
+   def stream_to_grafana(record)
+     # Send metrics to Grafana via InfluxDB/Prometheus
+     InfluxDB.write_point('chronicles_events', record)
    end
  end
+ ```
 
- # Use custom processor
+ #### Multiple Processor Pipeline
+
+ ```ruby
+ # Chain multiple processors for different purposes
  service = Whodunit::Chronicles::Service.new(
-   processor: MyCustomProcessor.new
+   adapter: Adapters::PostgreSQL.new,
+   processor: CompositeProcessor.new([
+     AnalyticsProcessor.new,  # For business intelligence
+     AlertingProcessor.new,   # For real-time monitoring
+     ComplianceProcessor.new, # For regulatory requirements
+     ArchivalProcessor.new    # For long-term storage
+   ])
  )
  ```
 
+ **Use Cases:**
+
+ - **📊 Business Intelligence**: Track user behavior patterns, feature adoption, revenue impact
+ - **🚨 Real-Time Monitoring**: Alert on suspicious activities, bulk operations, data anomalies
+ - **📈 Performance Analytics**: Database performance metrics, query optimization insights
+ - **🔍 Compliance Auditing**: Regulatory compliance, data governance, access patterns
+ - **💡 Product Analytics**: Feature usage, A/B testing data, user journey tracking
+
  ### Service Monitoring
 
  ```ruby
@@ -198,6 +477,80 @@ end
 
  ## 🧪 Testing
 
+ ### Integration Testing
+
+ Test Chronicles with your Rails application using these patterns:
+
+ #### Basic Testing Pattern
+
+ ```ruby
+ # Test basic Chronicles functionality
+ class ChroniclesIntegrationTest < ActiveSupport::TestCase
+   def setup
+     @service = Whodunit::Chronicles.service
+     @service.setup!
+     @service.start
+   end
+
+   def teardown
+     @service.stop
+     @service.teardown!
+   end
+
+   def test_audit_record_creation
+     # Create a user (triggers Whodunit)
+     user = User.create!(name: "John", email: "john@example.com")
+
+     # Wait for Chronicles to process
+     sleep 1
+
+     # Check the Chronicles audit record
+     audit_record = AuditRecord.find_by(
+       table_name: 'users',
+       action: 'INSERT',
+       record_id: { 'id' => user.id }
+     )
+
+     assert audit_record
+     assert_equal 'INSERT', audit_record.action
+     assert_equal user.name, audit_record.new_data['name']
+   end
+ end
+ ```
+
+ #### Advanced Analytics Testing
+
+ ```ruby
+ # Test custom processor functionality
+ class RecruitmentAnalyticsTest < ActiveSupport::TestCase
+   def setup
+     @processor = RecruitmentAnalyticsProcessor.new
+   end
+
+   def test_recruitment_stage_determination
+     change_event = create_change_event(
+       table_name: 'applications',
+       action: 'UPDATE',
+       new_data: { 'status' => 'hired' }
+     )
+
+     record = @processor.build_chronicles_record(change_event)
+
+     assert_equal 'hire', record[:recruitment_stage]
+     assert record[:cost_per_hire_impact]
+   end
+
+   def test_metrics_streaming
+     # Mock Prometheus and Grafana integrations
+     assert_difference 'RECRUITMENT_HIRES_TOTAL.get' do
+       @processor.stream_to_prometheus(hired_record)
+     end
+   end
+ end
+ ```
+
+ ### Unit Testing
+
  Chronicles includes comprehensive test coverage:
 
  ```bash
@@ -230,14 +583,33 @@ bundle exec brakeman
 
  ## 🤝 Contributing
 
+ We welcome contributions! Chronicles is designed to be extensible and work across different business domains.
+
  1. Fork the repository
- 2. Create your feature branch (`git checkout -b feature/amazing-feature`)
- 3. Make your changes with tests
- 4. Ensure tests pass (`bundle exec rake test`)
- 5. Ensure RuboCop passes (`bundle exec rubocop`)
+ 2. Set up your development environment:
+    ```bash
+    bundle install
+    bundle exec rake test # Ensure tests pass
+    ```
+ 3. Create your feature branch (`git checkout -b feature/amazing-feature`)
+ 4. Make your changes with comprehensive tests
+ 5. Test your changes:
+    - Unit tests: `bundle exec rake test`
+    - Code style: `bundle exec rubocop`
+    - Security: `bundle exec bundler-audit check`
  6. Commit your changes (`git commit -m 'Add amazing feature'`)
  7. Push to the branch (`git push origin feature/amazing-feature`)
- 8. Open a Pull Request
+ 8. Open a Pull Request with a detailed description
+
+ ### Contributing Custom Processors
+
+ We especially welcome custom processors for different business domains. Consider contributing processors for:
+
+ - E-commerce analytics (order tracking, inventory management)
+ - Financial services (transaction monitoring, compliance reporting)
+ - Healthcare (patient data tracking, regulatory compliance)
+ - Education (student progress, course analytics)
+ - SaaS metrics (user engagement, feature adoption)
 
  ## 📋 Requirements
 
@@ -246,12 +618,14 @@ bundle exec brakeman
 
  ## 🗺️ Roadmap
 
+ - [ ] **Prometheus Metrics**: Production monitoring integration (complete example code in examples/)
+ - [ ] **Advanced Example Apps**: Real-world use cases with a complete monitoring stack (complete example code in examples/)
+ - [ ] **Custom Analytics Processors**: Business intelligence and real-time monitoring (complete example code in examples/)
  - [ ] **MySQL/MariaDB Support**: Binlog streaming adapter for MySQL/MariaDB databases
  - [ ] **Redis Streams**: Alternative lightweight streaming backend
  - [ ] **Compression**: Optional audit record compression
  - [ ] **Retention Policies**: Automated audit record cleanup
  - [ ] **Web UI**: Management interface for monitoring and configuration
- - [ ] **Prometheus Metrics**: Production monitoring integration
 
  ## 📚 Documentation
 
@@ -2,6 +2,6 @@
 
  module Whodunit
    module Chronicles
-     VERSION = '0.1.0.pre'
+     VERSION = '0.1.0'
    end
  end
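The version bump above follows RubyGems' prerelease semantics: a version segment containing letters (such as `.pre`) marks a prerelease, and a prerelease always sorts before its corresponding final release. A quick sketch using the stdlib `Gem::Version`:

```ruby
# Gem::Version (from RubyGems, loaded by default) orders prereleases
# before the final release they precede.
require 'rubygems'

pre   = Gem::Version.new('0.1.0.pre')
final = Gem::Version.new('0.1.0')

puts pre.prerelease?   # prints "true"
puts final.prerelease? # prints "false"
puts pre < final       # prints "true"
```

This ordering is why `gem install whodunit-chronicles` resolves to `0.1.0` rather than `0.1.0.pre` once the final release is published.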
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: whodunit-chronicles
  version: !ruby/object:Gem::Version
-   version: 0.1.0.pre
+   version: 0.1.0
  platform: ruby
  authors:
  - Ken C. Demanawa
@@ -250,6 +250,9 @@ files:
  - LICENSE
  - README.md
  - Rakefile
+ - examples/images/campaign-performance-analytics.png
+ - examples/images/candidate-journey-analytics.png
+ - examples/images/recruitment-funnel-analytics.png
  - lib/.gitkeep
  - lib/whodunit-chronicles.rb
  - lib/whodunit/chronicles.rb
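The README's "Multiple Processor Pipeline" example above hands the service a `CompositeProcessor` without showing one. A minimal hand-rolled sketch of the idea (the class name and the `#process` contract are assumptions for illustration, not the gem's documented API):

```ruby
# Hypothetical fan-out processor: forwards each change event to every
# wrapped processor, collecting errors so one failing processor cannot
# stop the others.
class CompositeProcessor
  def initialize(processors)
    @processors = processors
  end

  def process(change_event)
    errors = []
    @processors.each do |processor|
      begin
        processor.process(change_event)
      rescue StandardError => e
        errors << e
      end
    end
    errors
  end
end

# Stand-in processor used only for this demonstration.
class CountingProcessor
  attr_reader :count

  def initialize
    @count = 0
  end

  def process(_change_event)
    @count += 1
  end
end

first  = CountingProcessor.new
second = CountingProcessor.new
pipeline = CompositeProcessor.new([first, second])

pipeline.process({ table_name: 'users', action: 'INSERT' })
puts first.count  # prints "1"
puts second.count # prints "1"
```

Collecting (rather than re-raising) errors mirrors the README's "Resilient" claim: an alerting failure should not prevent the archival processor from recording the same event.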