Conversation
-
Tom Morris (tommorris@mastodon.social)'s status on Friday, 07-Feb-2025 19:48:41 JST
@Edent @simon I mean, Nix… but it comes with a whole lot of spooky magic
-
Terence Eden (edent@mastodon.social)'s status on Friday, 07-Feb-2025 19:48:41 JST
@tommorris if I can install it on Android, I'll give it a go!
-
NH4ClO4 φ :nixos: :linux: (ammoniumperchlorate@rheinneckar.social)'s status on Friday, 07-Feb-2025 19:48:41 JST
@Edent @tommorris sure you can: https://github.com/nix-community/nix-on-droid
-
Terence Eden (edent@mastodon.social)'s status on Friday, 07-Feb-2025 19:48:42 JST
@simon yup. So far I've manually installed Rust, numpy, and a few other things. Getting closer!
One day there will be a package manager which Just Works™.
-
Terence Eden (edent@mastodon.social)'s status on Friday, 07-Feb-2025 19:48:43 JST
@simon tried running it on my phone, but looks like I'm stuck in dependency hell.
Perhaps I should ask it what I need to do to fix it? 😆
-
Simon Willison (simon@fedi.simonwillison.net)'s status on Friday, 07-Feb-2025 19:48:43 JST
@Edent hah, looks like it's trying to find a Rust compiler! I guess maybe for Pydantic which doesn't have binary wheels for that platform?
-
Simon Willison (simon@fedi.simonwillison.net)'s status on Friday, 07-Feb-2025 19:48:44 JST
If you have uv installed this means you can start chatting with a small model without first installing anything at all - this command will create an ephemeral virtual environment, install the necessary pieces and start a chat UI running in your terminal:
uvx --with llm-smollm2 llm chat -m SmolLM2
-
Simon Willison (simon@fedi.simonwillison.net)'s status on Friday, 07-Feb-2025 19:48:45 JST
Open challenge: can anyone find a useful application of a model this small?
-
Simon Willison (simon@fedi.simonwillison.net)'s status on Friday, 07-Feb-2025 19:48:46 JST
Today I learned about SmolLM2-135M-Instruct, a tiny LLM which quantizes down to just below 100MB... which means it can fit in a PyPI package!
Here's the first LLM plugin that includes a full model as part of the package:
llm install llm-smollm2
https://simonwillison.net/2025/Feb/7/pip-install-llm-smollm2/
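The "fits in a PyPI package" claim can be sanity-checked with back-of-the-envelope arithmetic. A rough sketch (assumptions, not stated in the thread: PyPI's default per-file upload limit is about 100 MB, and metadata/format overhead is ignored):

```python
# Estimate the file size of a 135M-parameter model at various
# quantization bit-widths: bytes = params * bits / 8.
PARAMS = 135_000_000

def size_mb(bits_per_weight):
    """Approximate size in MB for PARAMS weights at the given bit-width."""
    return PARAMS * bits_per_weight / 8 / 1_000_000

for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{size_mb(bits):.0f} MB")
# 16-bit: ~270 MB
# 8-bit:  ~135 MB
# 4-bit:  ~68 MB
```

So a 135M-parameter model needs roughly 4- to 6-bit quantization to come in "just below 100MB", consistent with the post.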
-