shiny.ollama: R 'Shiny' Interface for Chatting with Large Language Models Offline via Local 'ollama'

Chat with large language models on your own machine, with no internet connection and complete privacy, via 'ollama', through an R 'shiny' interface. For more information on 'ollama', visit <https://ollama.com>.
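A minimal usage sketch, assuming 'ollama' is installed and running locally and that the package exports a `run_app()` launcher (the function name follows common Shiny-app packaging conventions; check the reference manual for the exact export):

```r
# Install the released version from CRAN
install.packages("shiny.ollama")

# Launch the Shiny chat interface; it connects to the local
# ollama server (by default at http://localhost:11434)
library(shiny.ollama)
run_app()
```

Before launching, make sure at least one model has been pulled locally (e.g. `ollama pull llama3` from a terminal), since the app can only list and chat with models already available to the local 'ollama' server.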

Version: 0.1.0
Depends: R (≥ 3.5.0)
Imports: shiny (≥ 1.7.0), bslib (≥ 0.4.0), httr (≥ 1.4.0), jsonlite (≥ 1.8.0), markdown, mockery
Suggests: testthat (≥ 3.0.0), pkgdown (≥ 2.0.0)
Published: 2025-01-16
DOI: 10.32614/CRAN.package.shiny.ollama
Author: Indraneel Chakraborty [aut, cre]
Maintainer: Indraneel Chakraborty <hello.indraneel at gmail.com>
BugReports: https://github.com/ineelhere/shiny.ollama/issues
License: Apache License (≥ 2)
URL: https://www.indraneelchakraborty.com/shiny.ollama/, https://github.com/ineelhere/shiny.ollama
NeedsCompilation: no
Materials: README
CRAN checks: shiny.ollama results

Documentation:

Reference manual: shiny.ollama.pdf

Downloads:

Package source: shiny.ollama_0.1.0.tar.gz
Windows binaries: r-devel: shiny.ollama_0.1.0.zip, r-release: not available, r-oldrel: shiny.ollama_0.1.0.zip
macOS binaries: r-release (arm64): shiny.ollama_0.1.0.tgz, r-oldrel (arm64): shiny.ollama_0.1.0.tgz, r-release (x86_64): not available, r-oldrel (x86_64): not available

Linking:

Please use the canonical form https://CRAN.R-project.org/package=shiny.ollama to link to this page.