@wentorai/research-plugins 1.3.2 → 1.4.2

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (266)
  1. package/README.md +32 -56
  2. package/curated/analysis/README.md +1 -13
  3. package/curated/domains/README.md +1 -5
  4. package/curated/literature/README.md +3 -12
  5. package/curated/research/README.md +1 -18
  6. package/curated/tools/README.md +1 -12
  7. package/curated/writing/README.md +2 -6
  8. package/index.ts +88 -5
  9. package/openclaw.plugin.json +3 -12
  10. package/package.json +3 -5
  11. package/skills/analysis/statistics/SKILL.md +1 -1
  12. package/skills/analysis/statistics/meta-analysis-guide/SKILL.md +1 -1
  13. package/skills/domains/ai-ml/SKILL.md +3 -2
  14. package/skills/domains/ai-ml/generative-ai-guide/SKILL.md +1 -0
  15. package/skills/domains/ai-ml/huggingface-api/SKILL.md +251 -0
  16. package/skills/domains/biomedical/SKILL.md +9 -2
  17. package/skills/domains/biomedical/alphafold-api/SKILL.md +227 -0
  18. package/skills/domains/biomedical/biothings-api/SKILL.md +296 -0
  19. package/skills/domains/biomedical/clinicaltrials-api-v2/SKILL.md +216 -0
  20. package/skills/domains/biomedical/enrichr-api/SKILL.md +264 -0
  21. package/skills/domains/biomedical/ensembl-rest-api/SKILL.md +204 -0
  22. package/skills/domains/biomedical/medical-data-api/SKILL.md +197 -0
  23. package/skills/domains/biomedical/pdb-structure-api/SKILL.md +219 -0
  24. package/skills/domains/business/SKILL.md +2 -3
  25. package/skills/domains/chemistry/SKILL.md +3 -2
  26. package/skills/domains/chemistry/catalysis-hub-api/SKILL.md +171 -0
  27. package/skills/domains/education/SKILL.md +2 -3
  28. package/skills/domains/law/SKILL.md +3 -2
  29. package/skills/domains/law/uk-legislation-api/SKILL.md +179 -0
  30. package/skills/literature/discovery/SKILL.md +1 -1
  31. package/skills/literature/discovery/citation-alert-guide/SKILL.md +2 -2
  32. package/skills/literature/discovery/conference-proceedings-guide/SKILL.md +2 -2
  33. package/skills/literature/discovery/literature-mapping-guide/SKILL.md +1 -1
  34. package/skills/literature/discovery/paper-recommendation-guide/SKILL.md +8 -14
  35. package/skills/literature/discovery/rss-paper-feeds/SKILL.md +20 -14
  36. package/skills/literature/discovery/semantic-paper-radar/SKILL.md +8 -8
  37. package/skills/literature/discovery/semantic-scholar-recs-guide/SKILL.md +103 -86
  38. package/skills/literature/fulltext/SKILL.md +3 -2
  39. package/skills/literature/fulltext/arxiv-latex-source/SKILL.md +195 -0
  40. package/skills/literature/fulltext/open-access-guide/SKILL.md +1 -1
  41. package/skills/literature/fulltext/open-access-mining-guide/SKILL.md +5 -5
  42. package/skills/literature/metadata/citation-network-guide/SKILL.md +3 -3
  43. package/skills/literature/metadata/h-index-guide/SKILL.md +0 -27
  44. package/skills/literature/search/SKILL.md +3 -4
  45. package/skills/literature/search/citation-chaining-guide/SKILL.md +42 -32
  46. package/skills/literature/search/database-comparison-guide/SKILL.md +1 -1
  47. package/skills/literature/search/semantic-scholar-api/SKILL.md +56 -53
  48. package/skills/research/automation/SKILL.md +2 -3
  49. package/skills/research/automation/datagen-research-guide/SKILL.md +1 -0
  50. package/skills/research/automation/mle-agent-guide/SKILL.md +1 -0
  51. package/skills/research/automation/paper-to-agent-guide/SKILL.md +2 -1
  52. package/skills/research/deep-research/auto-deep-research-guide/SKILL.md +1 -0
  53. package/skills/research/deep-research/in-depth-research-guide/SKILL.md +1 -1
  54. package/skills/research/deep-research/kosmos-scientist-guide/SKILL.md +3 -3
  55. package/skills/research/deep-research/llm-scientific-discovery-guide/SKILL.md +1 -1
  56. package/skills/research/deep-research/local-deep-research-guide/SKILL.md +6 -6
  57. package/skills/research/deep-research/open-researcher-guide/SKILL.md +3 -3
  58. package/skills/research/deep-research/tongyi-deep-research-guide/SKILL.md +4 -4
  59. package/skills/research/methodology/SKILL.md +1 -1
  60. package/skills/research/methodology/claude-scientific-guide/SKILL.md +1 -0
  61. package/skills/research/methodology/grad-school-guide/SKILL.md +1 -1
  62. package/skills/research/methodology/qualitative-research-guide/SKILL.md +1 -1
  63. package/skills/research/paper-review/SKILL.md +1 -1
  64. package/skills/research/paper-review/automated-review-guide/SKILL.md +1 -1
  65. package/skills/research/paper-review/peer-review-guide/SKILL.md +1 -1
  66. package/skills/tools/diagram/excalidraw-diagram-guide/SKILL.md +1 -1
  67. package/skills/tools/diagram/mermaid-architect-guide/SKILL.md +1 -1
  68. package/skills/tools/diagram/plantuml-guide/SKILL.md +1 -1
  69. package/skills/tools/document/grobid-pdf-parsing/SKILL.md +1 -1
  70. package/skills/tools/document/paper-parse-guide/SKILL.md +2 -2
  71. package/skills/tools/knowledge-graph/SKILL.md +2 -3
  72. package/skills/tools/knowledge-graph/citation-network-builder/SKILL.md +5 -5
  73. package/skills/tools/knowledge-graph/knowledge-graph-construction/SKILL.md +1 -1
  74. package/skills/tools/ocr-translate/zotero-pdf2zh-guide/SKILL.md +1 -0
  75. package/skills/tools/scraping/academic-web-scraping/SKILL.md +1 -2
  76. package/skills/tools/scraping/google-scholar-scraper/SKILL.md +7 -7
  77. package/skills/writing/citation/SKILL.md +1 -1
  78. package/skills/writing/citation/academic-citation-manager/SKILL.md +20 -17
  79. package/skills/writing/citation/citation-assistant-skill/SKILL.md +72 -58
  80. package/skills/writing/citation/obsidian-citation-guide/SKILL.md +1 -0
  81. package/skills/writing/citation/obsidian-zotero-guide/SKILL.md +1 -0
  82. package/skills/writing/citation/onecite-reference-guide/SKILL.md +1 -1
  83. package/skills/writing/citation/papersgpt-zotero-guide/SKILL.md +1 -0
  84. package/skills/writing/citation/zotero-mdnotes-guide/SKILL.md +1 -0
  85. package/skills/writing/citation/zotero-reference-guide/SKILL.md +2 -1
  86. package/skills/writing/citation/zotero-scholar-guide/SKILL.md +1 -1
  87. package/skills/writing/composition/scientific-writing-resources/SKILL.md +1 -0
  88. package/skills/writing/latex/latex-drawing-collection/SKILL.md +1 -0
  89. package/skills/writing/latex/latex-templates-collection/SKILL.md +1 -0
  90. package/skills/writing/templates/novathesis-guide/SKILL.md +1 -0
  91. package/src/tools/arxiv.ts +81 -30
  92. package/src/tools/biorxiv.ts +158 -0
  93. package/src/tools/crossref.ts +63 -22
  94. package/src/tools/datacite.ts +191 -0
  95. package/src/tools/dblp.ts +125 -0
  96. package/src/tools/doaj.ts +82 -0
  97. package/src/tools/europe-pmc.ts +159 -0
  98. package/src/tools/hal.ts +118 -0
  99. package/src/tools/inspire-hep.ts +165 -0
  100. package/src/tools/openaire.ts +158 -0
  101. package/src/tools/openalex.ts +26 -15
  102. package/src/tools/opencitations.ts +112 -0
  103. package/src/tools/orcid.ts +139 -0
  104. package/src/tools/osf-preprints.ts +104 -0
  105. package/src/tools/pubmed.ts +22 -13
  106. package/src/tools/ror.ts +118 -0
  107. package/src/tools/unpaywall.ts +15 -6
  108. package/src/tools/util.ts +141 -0
  109. package/src/tools/zenodo.ts +157 -0
  110. package/mcp-configs/academic-db/ChatSpatial.json +0 -17
  111. package/mcp-configs/academic-db/academia-mcp.json +0 -17
  112. package/mcp-configs/academic-db/academic-paper-explorer.json +0 -17
  113. package/mcp-configs/academic-db/academic-search-mcp-server.json +0 -17
  114. package/mcp-configs/academic-db/agentinterviews-mcp.json +0 -17
  115. package/mcp-configs/academic-db/all-in-mcp.json +0 -17
  116. package/mcp-configs/academic-db/alphafold-mcp.json +0 -20
  117. package/mcp-configs/academic-db/apple-health-mcp.json +0 -17
  118. package/mcp-configs/academic-db/arxiv-latex-mcp.json +0 -17
  119. package/mcp-configs/academic-db/arxiv-mcp-server.json +0 -17
  120. package/mcp-configs/academic-db/bgpt-mcp.json +0 -17
  121. package/mcp-configs/academic-db/biomcp.json +0 -17
  122. package/mcp-configs/academic-db/biothings-mcp.json +0 -17
  123. package/mcp-configs/academic-db/brightspace-mcp.json +0 -21
  124. package/mcp-configs/academic-db/catalysishub-mcp-server.json +0 -17
  125. package/mcp-configs/academic-db/climatiq-mcp.json +0 -20
  126. package/mcp-configs/academic-db/clinicaltrialsgov-mcp-server.json +0 -17
  127. package/mcp-configs/academic-db/deep-research-mcp.json +0 -17
  128. package/mcp-configs/academic-db/dicom-mcp.json +0 -17
  129. package/mcp-configs/academic-db/enrichr-mcp-server.json +0 -17
  130. package/mcp-configs/academic-db/fec-mcp-server.json +0 -17
  131. package/mcp-configs/academic-db/fhir-mcp-server-themomentum.json +0 -17
  132. package/mcp-configs/academic-db/fhir-mcp.json +0 -19
  133. package/mcp-configs/academic-db/gget-mcp.json +0 -17
  134. package/mcp-configs/academic-db/gibs-mcp.json +0 -20
  135. package/mcp-configs/academic-db/gis-mcp-server.json +0 -22
  136. package/mcp-configs/academic-db/google-earth-engine-mcp.json +0 -21
  137. package/mcp-configs/academic-db/google-researcher-mcp.json +0 -17
  138. package/mcp-configs/academic-db/idea-reality-mcp.json +0 -17
  139. package/mcp-configs/academic-db/legiscan-mcp.json +0 -19
  140. package/mcp-configs/academic-db/lex.json +0 -17
  141. package/mcp-configs/academic-db/m4-clinical-mcp.json +0 -21
  142. package/mcp-configs/academic-db/medical-mcp.json +0 -21
  143. package/mcp-configs/academic-db/nexonco-mcp.json +0 -20
  144. package/mcp-configs/academic-db/omop-mcp.json +0 -20
  145. package/mcp-configs/academic-db/onekgpd-mcp.json +0 -20
  146. package/mcp-configs/academic-db/openedu-mcp.json +0 -20
  147. package/mcp-configs/academic-db/opengenes-mcp.json +0 -20
  148. package/mcp-configs/academic-db/openstax-mcp.json +0 -21
  149. package/mcp-configs/academic-db/openstreetmap-mcp.json +0 -21
  150. package/mcp-configs/academic-db/opentargets-mcp.json +0 -21
  151. package/mcp-configs/academic-db/pdb-mcp.json +0 -21
  152. package/mcp-configs/academic-db/smithsonian-mcp.json +0 -20
  153. package/mcp-configs/ai-platform/Adaptive-Graph-of-Thoughts-MCP-server.json +0 -17
  154. package/mcp-configs/ai-platform/ai-counsel.json +0 -17
  155. package/mcp-configs/ai-platform/atlas-mcp-server.json +0 -17
  156. package/mcp-configs/ai-platform/counsel-mcp.json +0 -17
  157. package/mcp-configs/ai-platform/cross-llm-mcp.json +0 -17
  158. package/mcp-configs/ai-platform/gptr-mcp.json +0 -17
  159. package/mcp-configs/ai-platform/magi-researchers.json +0 -21
  160. package/mcp-configs/ai-platform/mcp-academic-researcher.json +0 -22
  161. package/mcp-configs/ai-platform/open-paper-machine.json +0 -21
  162. package/mcp-configs/ai-platform/paper-intelligence.json +0 -21
  163. package/mcp-configs/ai-platform/paper-reader.json +0 -21
  164. package/mcp-configs/ai-platform/paperdebugger.json +0 -21
  165. package/mcp-configs/browser/decipher-research-agent.json +0 -17
  166. package/mcp-configs/browser/deep-research.json +0 -17
  167. package/mcp-configs/browser/everything-claude-code.json +0 -17
  168. package/mcp-configs/browser/exa-mcp.json +0 -20
  169. package/mcp-configs/browser/gpt-researcher.json +0 -17
  170. package/mcp-configs/browser/heurist-agent-framework.json +0 -17
  171. package/mcp-configs/browser/mcp-searxng.json +0 -21
  172. package/mcp-configs/browser/mcp-webresearch.json +0 -20
  173. package/mcp-configs/cloud-docs/confluence-mcp.json +0 -37
  174. package/mcp-configs/cloud-docs/google-drive-mcp.json +0 -35
  175. package/mcp-configs/cloud-docs/notion-mcp.json +0 -29
  176. package/mcp-configs/communication/discord-mcp.json +0 -29
  177. package/mcp-configs/communication/discourse-mcp.json +0 -21
  178. package/mcp-configs/communication/slack-mcp.json +0 -29
  179. package/mcp-configs/communication/telegram-mcp.json +0 -28
  180. package/mcp-configs/data-platform/4everland-hosting-mcp.json +0 -17
  181. package/mcp-configs/data-platform/automl-stat-mcp.json +0 -21
  182. package/mcp-configs/data-platform/context-keeper.json +0 -17
  183. package/mcp-configs/data-platform/context7.json +0 -19
  184. package/mcp-configs/data-platform/contextstream-mcp.json +0 -17
  185. package/mcp-configs/data-platform/email-mcp.json +0 -17
  186. package/mcp-configs/data-platform/jefferson-stats-mcp.json +0 -22
  187. package/mcp-configs/data-platform/mcp-excel-server.json +0 -21
  188. package/mcp-configs/data-platform/mcp-stata.json +0 -21
  189. package/mcp-configs/data-platform/mcpstack-jupyter.json +0 -21
  190. package/mcp-configs/data-platform/ml-mcp.json +0 -21
  191. package/mcp-configs/data-platform/nasdaq-data-link-mcp.json +0 -20
  192. package/mcp-configs/data-platform/numpy-mcp.json +0 -21
  193. package/mcp-configs/database/neo4j-mcp.json +0 -37
  194. package/mcp-configs/database/postgres-mcp.json +0 -28
  195. package/mcp-configs/database/sqlite-mcp.json +0 -29
  196. package/mcp-configs/dev-platform/geogebra-mcp.json +0 -21
  197. package/mcp-configs/dev-platform/github-mcp.json +0 -31
  198. package/mcp-configs/dev-platform/gitlab-mcp.json +0 -34
  199. package/mcp-configs/dev-platform/latex-mcp-server.json +0 -21
  200. package/mcp-configs/dev-platform/manim-mcp.json +0 -20
  201. package/mcp-configs/dev-platform/mcp-echarts.json +0 -20
  202. package/mcp-configs/dev-platform/panel-viz-mcp.json +0 -20
  203. package/mcp-configs/dev-platform/paperbanana.json +0 -20
  204. package/mcp-configs/dev-platform/texflow-mcp.json +0 -20
  205. package/mcp-configs/dev-platform/texmcp.json +0 -20
  206. package/mcp-configs/dev-platform/typst-mcp.json +0 -21
  207. package/mcp-configs/dev-platform/vizro-mcp.json +0 -20
  208. package/mcp-configs/email/email-mcp.json +0 -40
  209. package/mcp-configs/email/gmail-mcp.json +0 -37
  210. package/mcp-configs/note-knowledge/ApeRAG.json +0 -17
  211. package/mcp-configs/note-knowledge/In-Memoria.json +0 -17
  212. package/mcp-configs/note-knowledge/agent-memory.json +0 -17
  213. package/mcp-configs/note-knowledge/aimemo.json +0 -17
  214. package/mcp-configs/note-knowledge/biel-mcp.json +0 -19
  215. package/mcp-configs/note-knowledge/cognee.json +0 -17
  216. package/mcp-configs/note-knowledge/context-awesome.json +0 -17
  217. package/mcp-configs/note-knowledge/context-mcp.json +0 -17
  218. package/mcp-configs/note-knowledge/conversation-handoff-mcp.json +0 -17
  219. package/mcp-configs/note-knowledge/cortex.json +0 -17
  220. package/mcp-configs/note-knowledge/devrag.json +0 -17
  221. package/mcp-configs/note-knowledge/easy-obsidian-mcp.json +0 -17
  222. package/mcp-configs/note-knowledge/engram.json +0 -17
  223. package/mcp-configs/note-knowledge/gnosis-mcp.json +0 -17
  224. package/mcp-configs/note-knowledge/graphlit-mcp-server.json +0 -19
  225. package/mcp-configs/note-knowledge/local-faiss-mcp.json +0 -21
  226. package/mcp-configs/note-knowledge/mcp-memory-service.json +0 -21
  227. package/mcp-configs/note-knowledge/mcp-obsidian.json +0 -23
  228. package/mcp-configs/note-knowledge/mcp-ragdocs.json +0 -20
  229. package/mcp-configs/note-knowledge/mcp-summarizer.json +0 -21
  230. package/mcp-configs/note-knowledge/mediawiki-mcp.json +0 -21
  231. package/mcp-configs/note-knowledge/openzim-mcp.json +0 -20
  232. package/mcp-configs/note-knowledge/zettelkasten-mcp.json +0 -21
  233. package/mcp-configs/reference-mgr/academic-paper-mcp-http.json +0 -20
  234. package/mcp-configs/reference-mgr/academix.json +0 -20
  235. package/mcp-configs/reference-mgr/arxiv-cli.json +0 -17
  236. package/mcp-configs/reference-mgr/arxiv-research-mcp.json +0 -21
  237. package/mcp-configs/reference-mgr/arxiv-search-mcp.json +0 -17
  238. package/mcp-configs/reference-mgr/chiken.json +0 -17
  239. package/mcp-configs/reference-mgr/claude-scholar.json +0 -17
  240. package/mcp-configs/reference-mgr/devonthink-mcp.json +0 -17
  241. package/mcp-configs/reference-mgr/google-scholar-abstract-mcp.json +0 -19
  242. package/mcp-configs/reference-mgr/google-scholar-mcp.json +0 -20
  243. package/mcp-configs/reference-mgr/mcp-paperswithcode.json +0 -21
  244. package/mcp-configs/reference-mgr/mcp-scholarly.json +0 -20
  245. package/mcp-configs/reference-mgr/mcp-simple-arxiv.json +0 -20
  246. package/mcp-configs/reference-mgr/mcp-simple-pubmed.json +0 -20
  247. package/mcp-configs/reference-mgr/mcp-zotero.json +0 -21
  248. package/mcp-configs/reference-mgr/mendeley-mcp.json +0 -20
  249. package/mcp-configs/reference-mgr/ncbi-mcp-server.json +0 -22
  250. package/mcp-configs/reference-mgr/onecite.json +0 -21
  251. package/mcp-configs/reference-mgr/paper-search-mcp.json +0 -21
  252. package/mcp-configs/reference-mgr/pubmed-search-mcp.json +0 -21
  253. package/mcp-configs/reference-mgr/scholar-mcp.json +0 -21
  254. package/mcp-configs/reference-mgr/scholar-multi-mcp.json +0 -21
  255. package/mcp-configs/reference-mgr/seerai.json +0 -21
  256. package/mcp-configs/reference-mgr/semantic-scholar-fastmcp.json +0 -21
  257. package/mcp-configs/reference-mgr/sourcelibrary.json +0 -20
  258. package/mcp-configs/registry.json +0 -476
  259. package/mcp-configs/repository/dataverse-mcp.json +0 -33
  260. package/mcp-configs/repository/huggingface-mcp.json +0 -29
  261. package/skills/domains/business/xpert-bi-guide/SKILL.md +0 -84
  262. package/skills/domains/education/edumcp-guide/SKILL.md +0 -74
  263. package/skills/literature/search/paper-search-mcp-guide/SKILL.md +0 -107
  264. package/skills/research/automation/mcp-server-guide/SKILL.md +0 -211
  265. package/skills/tools/knowledge-graph/paperpile-notion-guide/SKILL.md +0 -84
  266. package/src/tools/semantic-scholar.ts +0 -66
@@ -40,24 +40,30 @@ Examine the reference list of each seed paper and identify which cited works are
  ```python
  import requests
 
- def get_references(paper_id, limit=100):
-     """Get all references of a paper via Semantic Scholar."""
-     url = f"https://api.semanticscholar.org/graph/v1/paper/{paper_id}/references"
-     response = requests.get(url, params={
-         "fields": "title,year,citationCount,externalIds,abstract",
-         "limit": limit
-     })
-     refs = response.json().get("data", [])
-     return [r["citedPaper"] for r in refs if r["citedPaper"].get("title")]
+ HEADERS = {"User-Agent": "ResearchPlugins/1.0 (https://wentor.ai)"}
+
+ def get_references(work_id):
+     """Get all references of a paper via OpenAlex."""
+     url = f"https://api.openalex.org/works/{work_id}"
+     response = requests.get(url, headers=HEADERS)
+     paper = response.json()
+     ref_ids = paper.get("referenced_works", [])
+
+     references = []
+     for ref_id in ref_ids:
+         ref = requests.get(f"https://api.openalex.org/works/{ref_id.split('/')[-1]}", headers=HEADERS).json()
+         if ref.get("title"):
+             references.append(ref)
+     return references
 
  # Get references of a seed paper
- seed_doi = "DOI:10.1038/s41586-021-03819-2"
- references = get_references(seed_doi)
+ seed_id = "W2741809807"
+ references = get_references(seed_id)
 
  # Sort by citation count to find the most influential foundations
- references.sort(key=lambda p: p.get("citationCount", 0), reverse=True)
+ references.sort(key=lambda p: p.get("cited_by_count", 0), reverse=True)
  for ref in references[:15]:
-     print(f"[{ref.get('year', '?')}] {ref['title']} ({ref.get('citationCount', 0)} citations)")
+     print(f"[{ref.get('publication_year', '?')}] {ref['title']} ({ref.get('cited_by_count', 0)} citations)")
  ```
 
  ### Step 3: Forward Chaining (Citation Tracking)
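Note for reviewers: the new `get_references` issues one request per referenced work. OpenAlex also documents a `cited_by` filter that returns a work's references in a single paged query; a minimal batched sketch under that assumption (the `HEADERS` constant mirrors the one added in this diff):

```python
import requests

OPENALEX = "https://api.openalex.org/works"
HEADERS = {"User-Agent": "ResearchPlugins/1.0 (https://wentor.ai)"}

def reference_query(work_id, per_page=200):
    """Build query params that return all works cited by `work_id` in one request."""
    return {
        "filter": f"cited_by:{work_id}",   # works appearing in work_id's referenced_works
        "sort": "cited_by_count:desc",     # most-cited foundations first
        "per-page": per_page,
    }

def get_references_batch(work_id):
    """Fetch a work's references with a single paged query instead of N lookups."""
    response = requests.get(OPENALEX, params=reference_query(work_id), headers=HEADERS)
    response.raise_for_status()
    return response.json().get("results", [])
```

`get_references_batch("W2741809807")` should return the same set as the per-reference loop, in one round trip per 200 references.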
@@ -65,28 +71,32 @@ for ref in references[:15]:
  Find all papers that have cited your seed paper.
 
  ```python
- def get_citations(paper_id, limit=200):
-     """Get papers citing a given paper via Semantic Scholar."""
-     url = f"https://api.semanticscholar.org/graph/v1/paper/{paper_id}/citations"
+ def get_citations(work_id, limit=200):
+     """Get papers citing a given paper via OpenAlex."""
      all_citations = []
-     offset = 0
-     while offset < limit:
-         response = requests.get(url, params={
-             "fields": "title,year,citationCount,externalIds,abstract",
-             "limit": min(100, limit - offset),
-             "offset": offset
-         })
-         data = response.json().get("data", [])
-         if not data:
+     page = 1
+     while len(all_citations) < limit:
+         response = requests.get(
+             "https://api.openalex.org/works",
+             params={
+                 "filter": f"cites:{work_id}",
+                 "sort": "cited_by_count:desc",
+                 "per_page": min(200, limit - len(all_citations)),
+                 "page": page
+             },
+             headers=HEADERS
+         )
+         results = response.json().get("results", [])
+         if not results:
              break
-         all_citations.extend([c["citingPaper"] for c in data if c["citingPaper"].get("title")])
-         offset += len(data)
+         all_citations.extend(results)
+         page += 1
      return all_citations
 
- citations = get_citations(seed_doi)
+ citations = get_citations(seed_id)
  # Filter for recent, well-cited papers
- recent_impactful = [c for c in citations if c.get("year", 0) >= 2022 and c.get("citationCount", 0) >= 5]
- recent_impactful.sort(key=lambda p: p.get("citationCount", 0), reverse=True)
+ recent_impactful = [c for c in citations if c.get("publication_year", 0) >= 2022 and c.get("cited_by_count", 0) >= 5]
+ recent_impactful.sort(key=lambda p: p.get("cited_by_count", 0), reverse=True)
  ```
 
  ### Step 4: Co-Citation and Bibliographic Coupling
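Note for reviewers: OpenAlex documents cursor paging (`cursor=*`, then `meta.next_cursor`) for deep result sets, since basic `page`-based paging is capped for deep offsets. A sketch under that assumption, with the pure pagination driver separated from the HTTP call so it can be tested and reused:

```python
import requests

HEADERS = {"User-Agent": "ResearchPlugins/1.0 (https://wentor.ai)"}

def paginate(fetch_page, max_results=1000):
    """Drive cursor paging: fetch_page(cursor) -> (items, next_cursor or None)."""
    items, cursor = [], "*"          # "*" asks OpenAlex for the first page
    while cursor and len(items) < max_results:
        page_items, cursor = fetch_page(cursor)
        if not page_items:
            break
        items.extend(page_items)
    return items[:max_results]

def citations_page(work_id, cursor):
    """Fetch one page of works citing `work_id`, returning (results, next_cursor)."""
    resp = requests.get(
        "https://api.openalex.org/works",
        params={"filter": f"cites:{work_id}", "per-page": 200, "cursor": cursor},
        headers=HEADERS,
    ).json()
    return resp.get("results", []), resp.get("meta", {}).get("next_cursor")

def get_all_citations(work_id, max_results=1000):
    return paginate(lambda c: citations_page(work_id, c), max_results)
```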
@@ -134,7 +144,7 @@ Repeat the process with the most relevant papers discovered in each round:
  | Google Scholar "Cited by" | Forward chaining | Free |
  | Web of Science "Cited References" / "Times Cited" | Both directions | Subscription |
  | Scopus "References" / "Cited by" | Both directions | Subscription |
- | Semantic Scholar API | Programmatic, both directions | Free |
+ | OpenAlex API | Programmatic, both directions | Free |
  | Connected Papers (connectedpapers.com) | Visual co-citation graph | Free (limited) |
  | Litmaps (litmaps.com) | Visual citation network | Free tier |
  | CoCites (cocites.com) | Co-citation analysis | Free |
@@ -145,4 +155,4 @@ Repeat the process with the most relevant papers discovered in each round:
  - **Citation bias**: Highly cited papers are not always the best or most relevant. Pay attention to less-cited but methodologically sound papers.
  - **Recency bias**: Forward chaining favors recent papers with fewer citations. Allow time for citation accumulation or use Mendeley readership as a proxy.
  - **Field boundaries**: Citation chains tend to stay within disciplinary silos. Combine with keyword searches in adjacent-field databases to break out.
- - **Incomplete coverage**: No single database indexes all citations. Cross-check with at least two sources (e.g., Semantic Scholar + Google Scholar).
+ - **Incomplete coverage**: No single database indexes all citations. Cross-check with at least two sources (e.g., OpenAlex + Google Scholar).
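Note for reviewers: the coverage caveat can be checked programmatically. Crossref reports `is-referenced-by-count` and OpenAlex reports `cited_by_count` for the same DOI, and the two rarely agree exactly. A hedged sketch (the User-Agent string is illustrative):

```python
import requests

UA = {"User-Agent": "ResearchPlugins/1.0 (https://wentor.ai; mailto:dev@wentor.ai)"}

def extract_counts(crossref_message, openalex_work):
    """Pull the citation count each source reports for the same paper."""
    return {
        "crossref": crossref_message.get("is-referenced-by-count", 0),
        "openalex": openalex_work.get("cited_by_count", 0),
    }

def compare_citation_counts(doi):
    """Query both indexes for one DOI; a large gap signals incomplete coverage."""
    cr = requests.get(f"https://api.crossref.org/works/{doi}", headers=UA).json()["message"]
    oa = requests.get(f"https://api.openalex.org/works/https://doi.org/{doi}", headers=UA).json()
    return extract_counts(cr, oa)
```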
@@ -96,5 +96,5 @@ A robust literature search should query multiple databases to maximize recall:
 
  - **Scopus vs. Web of Science**: Scopus has broader coverage (especially post-2000 and non-English journals); WoS has deeper historical archives and the Journal Impact Factor.
  - **Google Scholar** finds the most results but lacks advanced filtering. Use it for snowball searches and finding grey literature, not as your primary systematic search tool.
- - **API access**: PubMed (E-utilities), Semantic Scholar, OpenAlex, and Crossref all offer free APIs for programmatic searching. Scopus and WoS require institutional API keys.
+ - **API access**: PubMed (E-utilities), OpenAlex, and Crossref all offer free APIs for programmatic searching. Scopus and WoS require institutional API keys.
  - **Alert services**: Set up saved search alerts on PubMed, Scopus, and Google Scholar to stay current in fast-moving fields.
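Note for reviewers: the API-access bullet can be made concrete. PubMed's E-utilities expose `esearch.fcgi`, which returns matching PMIDs as JSON. A minimal sketch (the query term is illustrative):

```python
import requests

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def esearch_params(term, retmax=20):
    """Build an E-utilities query that returns up to `retmax` PMIDs as JSON."""
    return {"db": "pubmed", "term": term, "retmax": retmax, "retmode": "json"}

def pubmed_search(term, retmax=20):
    """Return a list of PMID strings matching `term`."""
    resp = requests.get(ESEARCH, params=esearch_params(term, retmax))
    resp.raise_for_status()
    return resp.json()["esearchresult"]["idlist"]
```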
@@ -1,134 +1,137 @@
  ---
  name: semantic-scholar-api
- description: "Search papers and analyze citation graphs via Semantic Scholar"
+ description: "Search papers and analyze citation graphs via OpenAlex and CrossRef APIs"
  metadata:
    openclaw:
      emoji: "🔍"
      category: "literature"
      subcategory: "search"
      keywords: ["academic database search", "semantic search", "AI-powered literature search", "citation analysis", "citation network"]
-     source: "https://api.semanticscholar.org/"
+     source: "https://api.openalex.org/"
  ---
 
- # Semantic Scholar API Guide
+ # OpenAlex & CrossRef API Guide
 
  ## Overview
 
- Semantic Scholar is a free, AI-powered research tool created by the Allen Institute for AI (AI2) that indexes over 200 million academic papers across all fields of science. Unlike traditional keyword-based search engines, Semantic Scholar uses natural language processing and machine learning to understand paper content, identify influential citations, and surface the most relevant results.
+ OpenAlex is a free, open catalog of the global research system, indexing over 250 million academic works across all fields of science. It provides structured access to papers, authors, institutions, concepts, and citation networks. OpenAlex is the successor to Microsoft Academic Graph and is maintained by OurResearch (the team behind Unpaywall).
 
- The Semantic Scholar Academic Graph API provides structured access to papers, authors, citations, and references. It distinguishes between influential and non-influential citations using a trained classifier, helping researchers quickly identify the most impactful works in any field. The API also provides TLDR summaries generated by AI for many papers.
+ CrossRef is the official DOI registration agency for scholarly content, providing metadata for over 150 million DOIs across all publishers and disciplines. Together, OpenAlex and CrossRef provide comprehensive coverage for academic search, citation analysis, and bibliometric research.
 
- The API can be used without authentication for basic access. Registering for a free API key unlocks higher rate limits and is recommended for production applications. The API returns clean JSON responses and supports field selection to minimize response payload size.
+ Both APIs are free to use without authentication. OpenAlex requests a polite `User-Agent` header; CrossRef requests a `User-Agent` with contact email for access to the polite pool (faster rate limits).
 
  ## Authentication
 
- No authentication is required for basic usage. For higher rate limits, request a free API key at https://www.semanticscholar.org/product/api and include it as a header:
+ No authentication is required for either API.
 
+ OpenAlex: Include a `User-Agent` header for polite access:
  ```
- x-api-key: YOUR_API_KEY
+ User-Agent: ResearchPlugins/1.0 (https://wentor.ai)
  ```
 
- Without an API key, rate limits are 5,000 requests per 5 minutes. With a key, limits are significantly higher (up to 1 request per second sustained).
+ CrossRef: Include a `User-Agent` header with contact email for polite pool:
+ ```
+ User-Agent: ResearchPlugins/1.0 (https://wentor.ai; mailto:dev@wentor.ai)
+ ```
 
  ## Core Endpoints
 
- ### Paper Search: Find Papers by Query
+ ### OpenAlex: Search Works
 
- - **URL**: `GET https://api.semanticscholar.org/graph/v1/paper/search`
+ - **URL**: `GET https://api.openalex.org/works`
  - **Parameters**:
  | Param | Type | Required | Description |
  |-------|------|----------|-------------|
- | query | string | Yes | Search query string |
- | offset | integer | No | Pagination offset (default: 0) |
- | limit | integer | No | Results per page (default: 10, max: 100) |
- | fields | string | No | Comma-separated fields to return (e.g., title,abstract,year,citationCount) |
- | year | string | No | Year range filter (e.g., 2020-2024 or 2024-) |
- | fieldsOfStudy | string | No | Filter by field (e.g., Computer Science, Medicine) |
+ | search | string | No | Full-text search query |
+ | filter | string | No | Filter expression (e.g., `from_publication_date:2024-01-01`) |
+ | sort | string | No | Sort field (e.g., `cited_by_count:desc`, `publication_date:desc`) |
+ | per_page | integer | No | Results per page (default: 25, max: 200) |
+ | page | integer | No | Page number (default: 1) |
  - **Example**:
  ```bash
- curl "https://api.semanticscholar.org/graph/v1/paper/search?query=attention+is+all+you+need&limit=5&fields=title,year,citationCount,authors,tldr"
+ curl "https://api.openalex.org/works?search=attention+is+all+you+need&per_page=5"
  ```
- - **Response**: JSON with `total`, `offset`, and `data` array containing paper objects with requested fields.
+ - **Response**: JSON with `meta` (count, page info) and `results` array containing work objects.
 
- ### Paper Details: Retrieve Full Paper Metadata
+ ### OpenAlex: Get Work Details
 
- - **URL**: `GET https://api.semanticscholar.org/graph/v1/paper/{paper_id}`
+ - **URL**: `GET https://api.openalex.org/works/{id}`
  - **Parameters**:
  | Param | Type | Required | Description |
  |-------|------|----------|-------------|
- | paper_id | string | Yes | Semantic Scholar ID, DOI, ArXiv ID, or other identifier (e.g., DOI:10.1234/...) |
- | fields | string | No | Comma-separated fields to return |
+ | id | string | Yes | OpenAlex ID (e.g., `W2741809807`), DOI URL, or other identifier |
  - **Example**:
  ```bash
- curl "https://api.semanticscholar.org/graph/v1/paper/DOI:10.18653/v1/N19-1423?fields=title,abstract,year,citationCount,influentialCitationCount,references,citations"
+ curl "https://api.openalex.org/works/W2741809807"
  ```
- - **Response**: JSON with full paper metadata including `paperId`, `title`, `abstract`, `year`, `citationCount`, `influentialCitationCount`, `references`, and `citations`.
+ - **Response**: JSON with full work metadata including `id`, `title`, `abstract_inverted_index`, `publication_year`, `cited_by_count`, `authorships`, `concepts`, `referenced_works`.
 
- ### Author Search: Find Researchers
69
+ ### OpenAlex: Search Authors
68
70
 
69
- - **URL**: `GET https://api.semanticscholar.org/graph/v1/author/search`
71
+ - **URL**: `GET https://api.openalex.org/authors`
70
72
  - **Parameters**:
71
73
  | Param | Type | Required | Description |
72
74
  |-------|------|----------|-------------|
73
- | query | string | Yes | Author name query |
74
- | offset | integer | No | Pagination offset |
75
- | limit | integer | No | Results per page (max: 1000) |
76
- | fields | string | No | Fields to return (e.g., name,paperCount,citationCount,hIndex) |
75
+ | search | string | No | Author name search |
76
+ | filter | string | No | Filter expression |
77
+ | per_page | integer | No | Results per page (max: 200) |
77
78
  - **Example**:
78
79
  ```bash
79
- curl "https://api.semanticscholar.org/graph/v1/author/search?query=Yoshua+Bengio&fields=name,paperCount,citationCount,hIndex"
80
+ curl "https://api.openalex.org/authors?search=Yoshua+Bengio&per_page=5"
80
81
  ```
81
- - **Response**: JSON with author profiles including publication and citation metrics.
82
+ - **Response**: JSON with author profiles including `works_count`, `cited_by_count`, `summary_stats.h_index`, affiliations.
82
83
 
83
- ### Dataset Releases: Bulk Data Access
84
+ ### CrossRef: Resolve DOI
84
85
 
85
- - **URL**: `GET https://api.semanticscholar.org/datasets/v1/release`
86
+ - **URL**: `GET https://api.crossref.org/works/{doi}`
86
87
  - **Parameters**:
87
88
  | Param | Type | Required | Description |
88
89
  |-------|------|----------|-------------|
89
- | (none) | - | - | Returns list of available dataset releases |
90
+ | doi | string | Yes | DOI to resolve (e.g., `10.1038/nature12373`) |
90
91
  - **Example**:
91
92
  ```bash
92
- curl "https://api.semanticscholar.org/datasets/v1/release"
93
+ curl "https://api.crossref.org/works/10.18653/v1/N19-1423"
93
94
  ```
94
- - **Response**: JSON array of release identifiers (dates) for bulk dataset downloads.
95
+ - **Response**: JSON with full bibliographic metadata including title, authors, journal, dates, references count, and citation count.
95
96
 
96
97
  ## Rate Limits
 
- Without API key: 5,000 requests per 5 minutes (approximately 16.7 requests per second in bursts). With API key: higher sustained throughput, varies by key tier. The API returns HTTP 429 when limits are exceeded. Use the `Retry-After` header value to determine wait time before retrying. Batch endpoints are available for retrieving multiple papers or authors in a single request, which is more efficient than individual lookups.
+ OpenAlex: free, no API key required. Be polite: include a contact email via the `mailto` query parameter or in the `User-Agent` header, and stay under roughly 10 requests per second. The API returns HTTP 429 when limits are exceeded.
+
+ CrossRef: the public pool allows roughly 50 requests per second; adding a contact email to the `User-Agent` header routes requests to the more reliable polite pool. The API returns HTTP 429 when limits are exceeded; use the `Retry-After` header to determine how long to wait before retrying.
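+ Both services reward identifying yourself. A minimal polite client with basic 429 back-off, assuming the third-party `requests` library (`polite_get` and the version string are illustrative):
+
+ ```python
+ import time
+ import requests
+
+ def polite_headers(email: str) -> dict:
+     """A contact email in the User-Agent qualifies for CrossRef's polite pool
+     and also identifies you to OpenAlex (which accepts a `mailto` query param too)."""
+     return {"User-Agent": f"research-plugins/1.4 (mailto:{email})"}
+
+ def polite_get(url: str, params=None, email="you@example.org", retries=3):
+     """GET with polite identification; honor Retry-After on HTTP 429."""
+     for _ in range(retries):
+         resp = requests.get(url, params=params, headers=polite_headers(email), timeout=30)
+         if resp.status_code != 429:
+             resp.raise_for_status()
+             return resp.json()
+         time.sleep(int(resp.headers.get("Retry-After", "1")))
+     raise RuntimeError("still rate limited after retries")
+ ```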
 
  ## Common Patterns
 
  ### Build a Citation Network
 
- Retrieve a paper and its citation tree to map influence:
+ Retrieve a paper and find all works that cite it:
 
  ```bash
- # Get paper with its references and citations
- curl "https://api.semanticscholar.org/graph/v1/paper/CorpusID:49313245?fields=title,citations.title,citations.citationCount,references.title,references.citationCount"
+ # Get paper details
+ curl "https://api.openalex.org/works/W2741809807"
+
+ # Get works citing this paper, sorted by citation count
+ curl "https://api.openalex.org/works?filter=cites:W2741809807&sort=cited_by_count:desc&per-page=20"
  ```
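+ When a paper has more than one page of citing works, OpenAlex supports cursor pagination: pass `cursor=*` on the first request and follow `meta.next_cursor` until it is null. A sketch, assuming the third-party `requests` library (`citing_works` is an illustrative wrapper name):
+
+ ```python
+ import requests
+
+ def paginate(fetch, params: dict):
+     """Follow OpenAlex cursor pagination until meta.next_cursor is exhausted."""
+     cursor = "*"
+     while cursor:
+         page = fetch({**params, "cursor": cursor})
+         yield from page["results"]
+         cursor = page["meta"].get("next_cursor")  # None on the last page
+
+ def citing_works(work_id: str):
+     """Yield every work that cites `work_id`."""
+     fetch = lambda p: requests.get("https://api.openalex.org/works",
+                                    params=p, timeout=30).json()
+     return paginate(fetch, {"filter": f"cites:{work_id}", "per-page": 200})
+ ```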
 
  ### Find Influential Papers on a Topic
 
- Search for highly cited and influential works:
+ Search for highly cited works on a topic:
 
  ```bash
- curl "https://api.semanticscholar.org/graph/v1/paper/search?query=graph+neural+networks&fields=title,year,citationCount,influentialCitationCount&limit=20"
+ curl "https://api.openalex.org/works?search=graph+neural+networks&sort=cited_by_count:desc&per-page=20"
  ```
 
- ### Batch Paper Lookup
+ ### Topic Search via CrossRef
 
- Retrieve metadata for multiple papers in a single request using the batch endpoint:
+ Search CrossRef for papers matching a query, sorted by citation count:
 
  ```bash
- curl -X POST "https://api.semanticscholar.org/graph/v1/paper/batch" \
- -H "Content-Type: application/json" \
- -d '{"ids": ["DOI:10.1038/s41586-021-03819-2", "CorpusID:49313245"]}' \
- --url-query "fields=title,year,citationCount"
+ curl "https://api.crossref.org/works?query=graph+neural+networks&sort=is-referenced-by-count&order=desc&rows=20"
  ```
 
  ## References
 
- - Official documentation: https://api.semanticscholar.org/
- - API tutorial: https://www.semanticscholar.org/product/api/tutorial
- - Semantic Scholar paper: https://arxiv.org/abs/2301.10140
+ - OpenAlex documentation: https://docs.openalex.org/
+ - CrossRef API documentation: https://api.crossref.org/swagger-ui/index.html
+ - OpenAlex source: https://github.com/ourresearch/openalex-guts
@@ -1,9 +1,9 @@
  ---
  name: automation-skills
- description: "11 research automation skills. Trigger: automating experiments, tracking results, reproducible pipelines. Design: ML experiment management, workflow orchestration, and lab automation tools."
+ description: "10 research automation skills. Trigger: automating experiments, tracking results, reproducible pipelines. Design: ML experiment management, workflow orchestration, and lab automation tools."
  ---
 
- # Research Automation — 11 Skills
+ # Research Automation — 10 Skills
 
  Select the skill matching the user's need, then `read` its SKILL.md.
 
@@ -15,7 +15,6 @@ Select the skill matching the user's need, then `read` its SKILL.md.
  | [data-collection-automation](./data-collection-automation/SKILL.md) | Automate survey deployment, data collection, and pipeline management |
  | [datagen-research-guide](./datagen-research-guide/SKILL.md) | AI-driven multi-agent research assistant for end-to-end studies |
  | [kedro-pipeline-guide](./kedro-pipeline-guide/SKILL.md) | Build reproducible data science pipelines with Kedro for research projects |
- | [mcp-server-guide](./mcp-server-guide/SKILL.md) | Index of 150 MCP server configs bundled with research-plugins |
  | [mle-agent-guide](./mle-agent-guide/SKILL.md) | Intelligent companion for ML engineering with arXiv integration |
  | [paper-to-agent-guide](./paper-to-agent-guide/SKILL.md) | Transform research papers into interactive AI agents for exploration |
  | [rd-agent-guide](./rd-agent-guide/SKILL.md) | Microsoft AI-driven R&D agent for automated data and model development |
@@ -8,6 +8,7 @@ metadata:
  openclaw:
  category: "research"
  subcategory: "automation"
+ emoji: "⚙️"
  keywords:
  - multi-agent
  - research-assistant
@@ -8,6 +8,7 @@ metadata:
  openclaw:
  category: "research"
  subcategory: "automation"
+ emoji: "🔬"
  keywords:
  - machine-learning
  - ml-engineering
@@ -8,6 +8,7 @@ metadata:
  openclaw:
  category: "research"
  subcategory: "automation"
+ emoji: "📄"
  keywords:
  - paper-parsing
  - agent-generation
@@ -82,7 +83,7 @@ The skill supports building knowledge graphs from processed papers:
 
  - Extract entities (methods, datasets, metrics, tools, concepts)
  - Map relationships between entities (uses, extends, contradicts, supports)
- - Link to external knowledge bases (Semantic Scholar, OpenAlex, DOI)
+ - Link to external knowledge bases (OpenAlex, CrossRef, DOI)
  - Track citation chains for key claims
  - Identify research lineages and methodological evolution
 
@@ -8,6 +8,7 @@ metadata:
  openclaw:
  category: "research"
  subcategory: "deep-research"
+ emoji: "🔍"
  keywords:
  - deep-research
  - automated-investigation
@@ -52,7 +52,7 @@ Search systematically across source tiers:
 
  | Tier | Source Type | Examples | Purpose |
  |------|-----------|---------|---------|
- | **1** | Academic databases | Semantic Scholar, PubMed, Scopus, Web of Science | Peer-reviewed primary research |
+ | **1** | Academic databases | OpenAlex, PubMed, Scopus, Web of Science | Peer-reviewed primary research |
  | **2** | Preprint servers | arXiv, bioRxiv, SSRN, medRxiv | Cutting-edge, not yet reviewed |
  | **3** | Grey literature | WHO reports, World Bank, NBER working papers | Policy and institutional knowledge |
  | **4** | Patents and standards | Google Patents, USPTO, IEEE standards | Technical implementations |
@@ -48,7 +48,7 @@ You are an AI Scientist conducting rigorous research.
  Follow the scientific method strictly:
 
  1. **Literature Review**: Search for related work before
- proposing anything new. Use Semantic Scholar API.
+ proposing anything new. Use OpenAlex API.
  2. **Hypothesis**: State falsifiable hypotheses clearly.
  3. **Experiment Design**: Define independent/dependent
  variables, controls, evaluation metrics.
@@ -62,7 +62,7 @@ Follow the scientific method strictly:
  ## Tools Available
  - Python 3.11+ with PyTorch, NumPy, SciPy
  - LaTeX (pdflatex + bibtex)
- - Semantic Scholar API for literature
+ - OpenAlex API for literature
  - W&B for experiment tracking (optional)
  ```
 
@@ -153,7 +153,7 @@ Analyze results and write paper:
  - Method (formal description)
  - Experiments (setup + results + analysis)
  - Conclusion (summary + limitations + future)
- 5. Verify all citations are real (Semantic Scholar)
+ 5. Verify all citations are real (OpenAlex/CrossRef)
  """
  ```
 
@@ -62,7 +62,7 @@ from scientific_agent import HypothesisGenerator
 
  generator = HypothesisGenerator(
  llm_provider="anthropic",
- knowledge_sources=["pubmed", "semantic_scholar"],
+ knowledge_sources=["pubmed", "openalex"],
  )
 
  hypotheses = generator.generate(
@@ -16,7 +16,7 @@ metadata:
 
  Local Deep Research is an open-source deep research tool with over 4,000 GitHub stars that conducts comprehensive multi-source research using either local LLMs (via Ollama, LM Studio, or vLLM) or cloud-based models. It searches across 10+ academic and web sources simultaneously, synthesizes the findings, and produces well-cited research reports. The project is designed for researchers who need thorough, multi-perspective research coverage while maintaining the option to keep everything running locally for privacy.
 
- What makes Local Deep Research stand out is its breadth of search integration. Rather than relying on a single search API, it queries multiple sources in parallel -- including Google Scholar, Semantic Scholar, arXiv, PubMed, Wikipedia, web search engines, and more -- then cross-references and synthesizes the results. This multi-source approach produces more comprehensive and balanced research outputs compared to single-source tools.
+ What makes Local Deep Research stand out is its breadth of search integration. Rather than relying on a single search API, it queries multiple sources in parallel -- including Google Scholar, OpenAlex, arXiv, PubMed, Wikipedia, web search engines, and more -- then cross-references and synthesizes the results. This multi-source approach produces more comprehensive and balanced research outputs compared to single-source tools.
 
  The tool is particularly well-suited for academic researchers who need to conduct preliminary literature reviews, verify claims across multiple databases, or explore interdisciplinary topics where relevant work may be scattered across different platforms and publication venues.
 
@@ -94,7 +94,7 @@ from local_deep_research import DeepResearcher
  researcher = DeepResearcher(
  llm_provider="ollama",
  llm_model="llama3.1:70b",
- search_sources=["google_scholar", "semantic_scholar",
+ search_sources=["google_scholar", "openalex",
  "arxiv", "web"],
  max_iterations=10,
  )
@@ -114,7 +114,7 @@ Local Deep Research queries multiple sources in parallel for each research sub-q
  | Source | Type | API Key Required | Best For |
  |--------|------|-----------------|----------|
  | Google Scholar | Academic | No (via scraping) | Broad academic search |
- | Semantic Scholar | Academic | Optional | CS/AI papers, citation data |
+ | OpenAlex | Academic | No | Cross-disciplinary, citation data |
  | arXiv | Academic | No | Preprints, ML/physics/math |
  | PubMed | Academic | No | Biomedical literature |
  | Wikipedia | Encyclopedia | No | Background and definitions |
@@ -128,12 +128,12 @@ Local Deep Research queries multiple sources in parallel for each research sub-q
  # Customize source priorities for your research domain
  researcher = DeepResearcher(
  search_sources={
- "primary": ["semantic_scholar", "arxiv"],
+ "primary": ["openalex", "arxiv"],
  "secondary": ["google_scholar", "web"],
  "reference": ["wikipedia", "crossref"],
  },
  source_weights={
- "semantic_scholar": 1.5, # Prioritize academic sources
+ "openalex": 1.5, # Prioritize academic sources
  "arxiv": 1.5,
  "web": 0.8,
  },
@@ -249,5 +249,5 @@ local-deep-research "Your sensitive research query here"
  - Repository: https://github.com/LearningCircuit/local-deep-research
  - Ollama: https://ollama.com/
  - SearXNG: https://github.com/searxng/searxng
- - Semantic Scholar API: https://api.semanticscholar.org/
+ - OpenAlex API: https://api.openalex.org/
  - arXiv API: https://info.arxiv.org/help/api/
@@ -43,14 +43,14 @@ result = researcher.research(
 
  ```python
  # Each sub-question triggers:
- # - Academic search (Semantic Scholar, arXiv)
+ # - Academic search (OpenAlex, arXiv)
  # - Paper reading (abstract + key sections)
  # - Evidence extraction
  # - Follow-up question generation
 
  # Configuration
  researcher = OpenResearcher(
- search_backends=["semantic_scholar", "arxiv"],
+ search_backends=["openalex", "arxiv"],
  max_iterations=5, # Research rounds per sub-question
  papers_per_iteration=10, # Papers to read per round
  follow_up_questions=True, # Generate follow-up questions
@@ -96,7 +96,7 @@ researcher = OpenResearcher(
  llm_provider="anthropic",
  model="claude-sonnet-4-20250514",
  search_config={
- "backends": ["semantic_scholar", "arxiv"],
+ "backends": ["openalex", "arxiv"],
  "max_results_per_query": 20,
  },
  reading_config={
@@ -119,12 +119,12 @@ DeepResearch integrates with multiple search providers to cast a wide net:
  - **Tavily**: AI-optimized search API designed for research agents
  - **Serper**: Fast Google search results API
  - **SearXNG**: Self-hosted meta-search engine for privacy-focused deployments
- - **Semantic Scholar API**: Direct academic paper search (no API key required for basic access)
+ - **OpenAlex API**: Direct academic paper search (free, no API key required)
 
  ```python
  # Configure multiple search backends for comprehensive coverage
  agent = DeepResearch(
- search_engines=["bing", "semantic_scholar"],
+ search_engines=["bing", "openalex"],
  search_strategy="parallel", # Search all engines simultaneously
  )
  ```
@@ -151,7 +151,7 @@ Create research profiles optimized for specific academic domains:
  # Biomedical research profile
  bio_config = {
  "preferred_sources": ["pubmed", "biorxiv", "nature", "science"],
- "search_engines": ["semantic_scholar", "bing"],
+ "search_engines": ["openalex", "bing"],
  "terminology_mode": "technical",
  "citation_format": "apa",
  }
@@ -214,4 +214,4 @@ The trace includes all search queries, retrieved documents, LLM prompts and resp
  - Repository: https://github.com/Alibaba-NLP/DeepResearch
  - Qwen model family: https://github.com/QwenLM/Qwen
  - Alibaba NLP group: https://github.com/Alibaba-NLP
- - Semantic Scholar API: https://api.semanticscholar.org/
+ - OpenAlex API: https://api.openalex.org/
@@ -17,7 +17,7 @@ Select the skill matching the user's need, then `read` its SKILL.md.
  | [mixed-methods-guide](./mixed-methods-guide/SKILL.md) | Guide to designing and conducting mixed methods research |
  | [osf-api-guide](./osf-api-guide/SKILL.md) | Access Open Science Framework for preregistrations, preprints, and data |
  | [parsifal-slr-guide](./parsifal-slr-guide/SKILL.md) | Plan and manage systematic literature reviews with Parsifal platform |
- | [qualitative-research-guide](./qualitative-research-guide/SKILL.md) | Design and conduct qualitative research using grounded theory, case studies, ... |
+ | [qualitative-research-guide](./qualitative-research-guide/SKILL.md) | Design and conduct qualitative research using grounded theory and case studies |
  | [research-paper-kb](./research-paper-kb/SKILL.md) | Build a persistent cross-session knowledge base from academic papers |
  | [research-town-guide](./research-town-guide/SKILL.md) | Simulate human research communities with multi-agent AI collaboration |
  | [scientify-idea-generation](./scientify-idea-generation/SKILL.md) | Generate research ideas from collected papers with gap analysis |
@@ -8,6 +8,7 @@ metadata:
  openclaw:
  category: "research"
  subcategory: "methodology"
+ emoji: "🧪"
  keywords:
  - scientific-method
  - research-skills
@@ -30,7 +30,7 @@ A strong research question is the foundation of any good paper. It should be spe
  |-----------|-------------|---------------|
  | **F**easible | Can be answered with available resources | Do you have the data, compute, and time? |
  | **I**nteresting | Engages the research community | Would peers read this at a top venue? |
- | **N**ovel | Not already answered | Has Semantic Scholar search been done? |
+ | **N**ovel | Not already answered | Has an OpenAlex/CrossRef search been done? |
  | **E**thical | Follows research ethics standards | Does it require IRB approval? |
  | **R**elevant | Advances the field meaningfully | Does it connect to open problems? |
 
@@ -1,6 +1,6 @@
  ---
  name: qualitative-research-guide
- description: "Design and conduct qualitative research using grounded theory, case studies, ..."
+ description: "Design and conduct qualitative research using grounded theory and case studies"
  metadata:
  openclaw:
  emoji: "🔍"
@@ -13,7 +13,7 @@ Select the skill matching the user's need, then `read` its SKILL.md.
  | [latte-review-guide](./latte-review-guide/SKILL.md) | Automate systematic literature reviews with LatteReview AI agents |
  | [paper-critique-framework](./paper-critique-framework/SKILL.md) | Structured framework for writing peer review reports and paper critiques |
  | [paper-reading-assistant](./paper-reading-assistant/SKILL.md) | AI-assisted paper reading, PDF Q&A, and summarization workflows |
- | [peer-review-guide](./peer-review-guide/SKILL.md) | Conduct thorough, constructive peer reviews and evaluate research papers crit... |
+ | [peer-review-guide](./peer-review-guide/SKILL.md) | Conduct thorough, constructive peer reviews and evaluate research papers |
  | [rebuttal-writing-guide](./rebuttal-writing-guide/SKILL.md) | Write effective rebuttals to reviewer comments for journal submissions |
  | [review-response-guide](./review-response-guide/SKILL.md) | Craft effective point-by-point reviewer response letters |
  | [scientify-write-review-paper](./scientify-write-review-paper/SKILL.md) | Write literature reviews and survey papers from collected papers |
@@ -274,7 +274,7 @@ Plagiarism and integrity:
 
  Reference management:
  - scite.ai: smart citation analysis (supporting/contrasting)
- - Semantic Scholar: related work discovery
+ - OpenAlex: related work discovery
  - Connected Papers: citation graph visualization
  ```
 
@@ -1,6 +1,6 @@
  ---
  name: peer-review-guide
- description: "Conduct thorough, constructive peer reviews and evaluate research papers crit..."
+ description: "Conduct thorough, constructive peer reviews and evaluate research papers"
  metadata:
  openclaw:
  emoji: "🕵️"
@@ -86,7 +86,7 @@ For software or experimental system diagrams, use grouped rectangles with labele
  Input: "Draw a system architecture with three layers:
  Frontend (React dashboard),
  Backend (FastAPI + PostgreSQL),
- External (Semantic Scholar API, CrossRef API)"
+ External (OpenAlex API, CrossRef API)"
  ```
 
  The output places each layer as a dashed-border container with internal component boxes and inter-layer arrows.