-
AI Augmented Investigative Journalism
by kaeru
—
published
Jul 20, 2025
—
last modified
Aug 14, 2025 09:12 PM
—
filed under:
anticorruption,
rocm,
investigative journalism,
AI,
docling,
lightrag,
open-webui
Initial explorations in using current AI tools and methods to supplement existing investigative journalism techniques
Located in
Notes
-
Local AI Assisted Development with OpenCode, Qwen 3.6 and llama.cpp
by kaeru
—
published
Apr 24, 2026
—
last modified
Apr 27, 2026 06:52 PM
—
filed under:
rocm,
AI,
opencode,
plone,
darktable
With the release of Qwen 3.6 35B A3B, AI-assisted development is now useful on a local setup with limited VRAM.
Located in
Notes
-
Processing Scanned Documents with AI
by kaeru
—
published
Aug 14, 2025
—
last modified
Aug 15, 2025 02:08 PM
—
filed under:
AI,
VLM,
docling
Older document processing methods still need to be used in conjunction with current AI Vision Language Models (VLMs) when dealing with badly scanned documents.
Located in
Notes
-
RAG with newer 30B models, Google Pinpoint and VLMs
by kaeru
—
published
Oct 26, 2025
—
last modified
Oct 28, 2025 10:51 AM
—
filed under:
VLM,
AI,
investigative journalism,
darktable,
lua,
google
Using LightRAG with the Qwen3 30B Thinking model, Google Pinpoint, and VLMs to speed up journalism workflows
Located in
Notes
-
Cleaning up Documents with VLMs
by kaeru
—
published
Dec 02, 2025
—
last modified
Dec 27, 2025 12:00 PM
—
filed under:
AI,
VLM,
1MDB,
anticorruption
Cleaning up documents, this time with VLMs and AI-generated one-off scripts
Located in
Notes
-
Querying Federation of Malaya Budget from 1959
by kaeru
—
published
Dec 05, 2025
—
last modified
Dec 05, 2025 10:09 PM
—
filed under:
VLM,
AI,
govdocs
The current state of VLMs allows us to accurately query printed tables from the late 1950s
Located in
Notes
-
Local AI with AMD Radeon 9070 XT on Ubuntu Linux 25.04 with ROCm 6.4.1
by kaeru
—
published
Jul 06, 2025
—
last modified
Aug 14, 2025 09:14 PM
—
filed under:
AI,
rocm,
ubuntu,
ollama,
llamacpp,
vllm,
docling
Everything seems to be working so far
Located in
Notes
-
ROCm and Vulkan on Ubuntu VM on FreeBSD with Bhyve GPU Passthrough
by kaeru
—
published
May 05, 2026
—
last modified
May 06, 2026 06:52 PM
—
filed under:
freebsd,
ubuntu,
AI,
rocm,
vulkan,
llama.cpp,
bhyve
With FreeBSD 15 and the bhyve hypervisor, you can run and serve GPU-powered local AI and compute services such as llama.cpp using ROCm and Vulkan with AMD GPUs ...
Located in
Notes