https://ollama.com>.">

shiny.ollama: R 'shiny' Interface for Chatting with Large Language Models Offline on Local with 'ollama'

Chat with large language models on your machine, without internet access and with complete privacy, via 'ollama', powered by an R 'shiny' interface. For more information on 'ollama', visit <https://ollama.com>.
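A minimal usage sketch, assuming the package exports run_app() as its entry point (per the package README; treat the function name as an assumption if your installed version differs) and that a local 'ollama' server is already running with at least one model pulled:

    # Install the released version from CRAN
    install.packages("shiny.ollama")

    # Launch the Shiny chat interface. This assumes an 'ollama' server is
    # running locally at its default address (http://localhost:11434) and
    # that a model has been pulled, e.g. `ollama pull llama3` in a terminal.
    library(shiny.ollama)
    run_app()  # assumed exported entry point; see the package documentation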

Version: 0.1.1
Depends: R (≥ 3.5.0)
Imports: shiny (≥ 1.7.0), bslib (≥ 0.4.0), httr (≥ 1.4.0), jsonlite (≥ 1.8.0), markdown, mockery
Suggests: testthat (≥ 3.0.0), pkgdown (≥ 2.0.0)
Published: 2025-01-27
DOI: 10.32614/CRAN.package.shiny.ollama
Author: Indraneel Chakraborty [aut, cre]
Maintainer: Indraneel Chakraborty <hello.indraneel at gmail.com>
BugReports: https://github.com/ineelhere/shiny.ollama/issues
License: Apache License (≥ 2)
URL: https://www.indraneelchakraborty.com/shiny.ollama/, https://github.com/ineelhere/shiny.ollama
NeedsCompilation: no
Materials: README
CRAN checks: shiny.ollama results

Please use the canonical form https://CRAN.R-project.org/package=shiny.ollama to link to this page.