journal-cli 1.0.5 → 1.0.9
- checksums.yaml +4 -4
- data/CHANGELOG.md +21 -20
- data/README.md +15 -12
- data/bin/journal +23 -9
- data/lib/journal-cli/checkin.rb +1 -1
- data/lib/journal-cli/version.rb +1 -1
- data/src/_README.md +15 -12
- metadata +1 -1
checksums.yaml
CHANGED

@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: bbc1e82ebcb01ece20b52ce550ec1ecbfb73c6b6ca42aed66b370196ab56fd5b
+  data.tar.gz: e1f43b0ff725b58c12b41592acc432f7e81b4beb2fcc894376f81a9538b6adec
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: d369cf8bb7b706d71c0c3b30900e2fd784d8475cf1380f3b19c4b3a1ec36b01cb1e51055dd33ba259247de6929abdd689e936bb6d66c4ba21c99765537d5a5c2
+  data.tar.gz: 9410b47a96eee2376679ecb67d6e9af75323e49feb93993c2288276908685c08a43459184a32889824bd839fdce8bf22353ec7831f4063920b9075029b1c8de9
|
data/CHANGELOG.md
CHANGED

@@ -1,43 +1,44 @@
-### 1.0.
+### 1.0.9
 
-2023-09-06
+2023-09-06 11:58
+
+#### NEW
 
-
+- If the second argument is a natural language date, use the parsed result instead of the current time for the entry
 
-
+### 1.0.8
+
+2023-09-06 11:56
 
 #### NEW
 
--
-- Multiple journals, multiple sections
+- If the second argument is a natural language date, use the parsed result instead of the current time for the entry
 
-### 1.0.
+### 1.0.7
 
-2023-09-06
+2023-09-06 11:56
 
 #### NEW
 
--
-- Multiple journals, multiple sections
+- If the second argument is a natural language date, use the parsed result instead of the current time for the entry
 
-### 1.0.
+### 1.0.6
 
-2023-09-06
+2023-09-06 11:54
 
 #### NEW
 
--
-
+- If the second argument is a natural language date, use the parsed result instead of the current time for the entry
+
+### 1.0.5
+
+2023-09-06 09:24
 
-### 1.0.
+### 1.0.0
 
-2023-09-06 09:
+2023-09-06 09:23
 
 #### NEW
 
 - Initial journal command
 - Multiple journals, multiple sections
-
-### 1.0.0
-
-2023-09-05 16:46
data/README.md
CHANGED

@@ -49,18 +49,19 @@ You can include weather data automatically by setting a question type to 'weathe
 Edit the file at `~/.config/journal/journals.yaml` following this structure:
 
 ```yaml
-
-
-
-
-
-
-
-
-
-
-
-
+journals:
+  daily: # journal key, will be used on the command line as `journal daily`
+    dayone: true # Enable or disable Day One integration
+    journal: Journal # Day One journal to add to (if using Day One integration)
+    markdown: daily # Type of Markdown file to create, false to skip (can be daily, individual, or digest)
+    title: Daily Journal # Title for every entry, date will be appended where needed
+    sections: # Required key
+      - title: null # The title for the section. If null, no section header will be created
+        key: journal # The key for the data collected, must be one word, alphanumeric characters and _ only
+        questions: # Required key
+          - prompt: How are you feeling? # The question to ask
+            key: journal # alphanumeric characters and _ only, will be nested in section key
+            type: multiline # The type of entry expected (numeric, string, or multiline)
 ```
 
 Keys must be alphanumeric characters and `_` (underscore) only. Titles and questions can be anything, but if they contain a colon (:), you'll need to quote the string.

@@ -121,6 +122,8 @@ A journal must contain a `sections` key, and each section must contain a `questi
 
 Once your configuration file is set up, you can just run `journal JOURNAL_KEY` to begin prompting for the answers to the configured questions.
 
+If a second argument contains a natural language date, the journal entry will be set to that date instead of the current time. For example, `journal mood "yesterday 5pm"` will create a new entry (in the journal configured for `mood`) for yesterday at 5pm.
+
 Answers will always be written to `~/.local/share/journal/[KEY].json` (where [KEY] is the journal key, one data file for each journal). If you've specified `daily` or `individual` Markdown formats, entries will be written to Markdown files in `~/.local/share/journal/entries/[KEY]`, either in a `%Y-%m-%d.md` file (daily), or in timestamped individual files. If `digest` is specified for the `markdown` key, a single file will be created at `~/.local/share/journal/[KEY].md`.
 
 At present there's no tool for querying the dataset created. You just need to parse the JSON and use your language of choice to extract the data. Numeric entries are stored as numbers, and every entry is timestamped, so you should be able to do some advanced analysis once you have enough data.
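The README's closing point, that analysis is a matter of parsing the JSON yourself, can be made concrete in a few lines of Ruby. The entry shape below is a hypothetical assumption for illustration only; inspect your own `[KEY].json` to confirm the real keys.

```ruby
require 'json'
require 'time'

# Hypothetical entry shape -- confirm against your own data file.
sample = <<~JSON
  [
    { "date": "2023-09-06 11:58",
      "data": { "journal": { "journal": "Felt good" } } }
  ]
JSON

entries = JSON.parse(sample)
entries.each do |entry|
  # Every entry is timestamped, so grouping by day is straightforward.
  day = Time.parse(entry['date']).strftime('%Y-%m-%d')
  puts "#{day}: #{entry.dig('data', 'journal', 'journal')}"
end
```

In practice you would replace the inline sample with `File.read(File.expand_path('~/.local/share/journal/KEY.json'))`.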
data/bin/journal
CHANGED

@@ -3,13 +3,7 @@
 $LOAD_PATH.unshift File.join(__dir__, '..', 'lib')
 require 'journal-cli'
 
-
-
-case ARGV[0]
-when /(-v|--version)/
-  puts "journal v#{Journal::VERSION}"
-  Process.exit 0
-when /(help|-h|--help)/
+def usage
   puts "journal v#{Journal::VERSION}"
   puts
   puts 'Usage: journal [type] [date]'

@@ -17,6 +11,20 @@ when /(help|-h|--help)/
   puts 'Available journal types:'
   config = Journal::Checkin.new
   puts(config.config['journals'].keys.map { |k| "- #{k}" })
+end
+
+if ARGV.count.zero?
+  puts "No journal specified"
+  usage
+  Process.exit 1
+end
+
+case ARGV[0]
+when /(-v|--version)/
+  puts "journal v#{Journal::VERSION}"
+  Process.exit 0
+when /(help|-h|--help)/
+  usage
   Process.exit 0
 end
 

@@ -28,5 +36,11 @@ date = if ARGV.length.positive?
   Time.now
 end
 checkin = Journal::Checkin.new
-checkin.
-checkin.
+if checkin.config['journals'].key?(journal)
+  checkin.start(journal, date)
+  checkin.go
+else
+  puts "Journal #{journal} not found"
+  usage
+  Process.exit 1
+end
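The guard logic this release adds to `bin/journal` (bail out with usage on a missing or unknown journal key) can be sketched in isolation. The `dispatch` helper below is hypothetical, written only for illustration; the real script consults `Journal::Checkin#config` and exits via `Process.exit`.

```ruby
# Hypothetical stand-in for the argument checks added in bin/journal.
def dispatch(argv, journals)
  return [:error, 'No journal specified'] if argv.empty?

  journal = argv[0]
  return [:error, "Journal #{journal} not found"] unless journals.key?(journal)

  [:ok, journal]
end

journals = { 'daily' => {}, 'mood' => {} } # stub for config['journals']
p dispatch([], journals)       # => [:error, "No journal specified"]
p dispatch(['mood'], journals) # => [:ok, "mood"]
p dispatch(['nope'], journals) # => [:error, "Journal nope not found"]
```

Separating the check from the exit, as here, makes the branching easy to test; the shipped script inlines it for simplicity.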
data/lib/journal-cli/checkin.rb
CHANGED

@@ -98,7 +98,7 @@ module Journal
       cmd << %(-j "#{@journal['journal']}") if @journal.key?('journal')
       cmd << %(-t #{@journal['tags'].join(' ')}) if @journal.key?('tags')
       cmd << %(-date "#{@date.strftime('%Y-%m-%d %I:%M %p')}")
-      `echo #{Shellwords.escape(to_markdown)} | #{cmd.join(' ')} -- new`
+      `echo #{Shellwords.escape(to_markdown(yaml: false, title: true))} | #{cmd.join(' ')} -- new`
     end
 
     if @journal['markdown']
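The changed line pipes the rendered Markdown through a backtick shell command, which is why it passes through `Shellwords.escape` first: quotes, ampersands, and backticks in an entry body would otherwise be interpreted by the shell. A minimal round-trip illustration (the sample text is made up):

```ruby
require 'shellwords'

body = %(Felt "good" today & slept well)
escaped = Shellwords.escape(body)

# The escaped form is a single shell word that splits back to the original.
puts Shellwords.split(escaped) == [body] # => true
```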
data/lib/journal-cli/version.rb
CHANGED
data/src/_README.md
CHANGED

@@ -53,18 +53,19 @@ You can include weather data automatically by setting a question type to 'weathe
 Edit the file at `~/.config/journal/journals.yaml` following this structure:
 
 ```yaml
-
-
-
-
-
-
-
-
-
-
-
-
+journals:
+  daily: # journal key, will be used on the command line as `journal daily`
+    dayone: true # Enable or disable Day One integration
+    journal: Journal # Day One journal to add to (if using Day One integration)
+    markdown: daily # Type of Markdown file to create, false to skip (can be daily, individual, or digest)
+    title: Daily Journal # Title for every entry, date will be appended where needed
+    sections: # Required key
+      - title: null # The title for the section. If null, no section header will be created
+        key: journal # The key for the data collected, must be one word, alphanumeric characters and _ only
+        questions: # Required key
+          - prompt: How are you feeling? # The question to ask
+            key: journal # alphanumeric characters and _ only, will be nested in section key
+            type: multiline # The type of entry expected (numeric, string, or multiline)
 ```
 
 Keys must be alphanumeric characters and `_` (underscore) only. Titles and questions can be anything, but if they contain a colon (:), you'll need to quote the string.

@@ -125,6 +126,8 @@ A journal must contain a `sections` key, and each section must contain a `questi
 
 Once your configuration file is set up, you can just run `journal JOURNAL_KEY` to begin prompting for the answers to the configured questions.
 
+If a second argument contains a natural language date, the journal entry will be set to that date instead of the current time. For example, `journal mood "yesterday 5pm"` will create a new entry (in the journal configured for `mood`) for yesterday at 5pm.
+
 Answers will always be written to `~/.local/share/journal/[KEY].json` (where [KEY] is the journal key, one data file for each journal). If you've specified `daily` or `individual` Markdown formats, entries will be written to Markdown files in `~/.local/share/journal/entries/[KEY]`, either in a `%Y-%m-%d.md` file (daily), or in timestamped individual files. If `digest` is specified for the `markdown` key, a single file will be created at `~/.local/share/journal/[KEY].md`.
 
 At present there's no tool for querying the dataset created. You just need to parse the JSON and use your language of choice to extract the data. Numeric entries are stored as numbers, and every entry is timestamped, so you should be able to do some advanced analysis once you have enough data.
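The date behavior this release documents (use a parsed second argument, else the current time) can be sketched with the standard library alone. This is a simplified, hypothetical helper: `Time.parse` handles strings like `2023-09-05 17:00` but not true natural language such as `yesterday 5pm`, which requires a dedicated parser not shown in this diff.

```ruby
require 'time'

# Fall back to Time.now when the argument is missing or unparseable.
def entry_time(arg)
  return Time.now if arg.nil? || arg.empty?

  Time.parse(arg)
rescue ArgumentError
  Time.now
end

entry_time('2023-09-05 17:00').hour # => 17
```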