I am looking to run some testing on a local ollama model in AI Studio. However, every time I open AI Studio, I have to redownload ollama and pull llama3 to use from my notebook. Any suggestion for how to save that download (done from terminal) so that it is already loaded when I open AI Studio?
AI Studio supports pip and conda installation persistence in a workspace container. However, containers are placed on a separate network from the OS's localhost network. This means that native OS processes and our containerized processes can't communicate over the network, so the container is not able to connect to a server running on the host's localhost.
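To make the networking limitation concrete, here is a minimal sketch (port 11434 is Ollama's default serve port) that checks whether a host/port is reachable. Run from inside the workspace container, a server bound on the host's localhost will not be reachable at 127.0.0.1:

```python
import socket

def can_connect(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# From the container, 127.0.0.1 is the container's own loopback, not the
# host's, so an Ollama server running natively on the OS won't answer here.
print(can_connect("127.0.0.1", 11434))
```

This prints False unless a server is running on that port inside the container itself, which is exactly the isolation described above.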
Since Ollama is a locally hosted service (a tool for running large language models locally that serves them over a local HTTP server), containerizing it would require a Dockerfile to build a custom image. Today AI Studio does not support creating or importing custom images, nor communication between the localhost network and the container, other than for base images and built-in services (e.g. MLflow, TensorBoard, Swagger API).
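For reference, if a workspace did expose a persisted directory (hypothetical here — today AI Studio only persists pip and conda installs), Ollama's real `OLLAMA_MODELS` environment variable could point its model cache there so pulled models survive restarts. A sketch, with the persisted path being an assumption you would adjust:

```python
import os
import pathlib
import shutil
import subprocess

# Hypothetical persisted path -- substitute whatever your workspace actually keeps
persist = pathlib.Path.home() / "persist" / "ollama-models"
persist.mkdir(parents=True, exist_ok=True)

# Ollama reads OLLAMA_MODELS to decide where to store pulled model files
env = {**os.environ, "OLLAMA_MODELS": str(persist)}

# Only attempt the pull if the ollama binary exists in this environment
if shutil.which("ollama"):
    subprocess.run(["ollama", "pull", "llama3"], env=env, check=True)
else:
    print(f"ollama not installed; models would be cached in {persist}")
```

Again, this only helps once the persistence and networking pieces described above are in place.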
That said, feedback like this is greatly appreciated, and we're happy to share that these features are on the roadmap. We'll let users know as new features become available... stay tuned!