Package Details: lmstudio-beta 0.3.9.3-1

Git Clone URL: https://aur.archlinux.org/lmstudio-beta.git (read-only)
Package Base: lmstudio-beta
Description: Discover, download, and run local LLMs
Upstream URL: https://lmstudio.ai/
Licenses: custom
Conflicts: lmstudio
Provides: lmstudio
Submitter: envolution
Maintainer: envolution
Last Packager: envolution
Votes: 0
Popularity: 0.000000
First Submitted: 2025-01-18 18:04 (UTC)
Last Updated: 2025-01-30 01:12 (UTC)

Latest Comments

envolution commented on 2025-01-26 01:48 (UTC)

Release Notes - LM Studio 0.3.9 Build 1 (Beta)

Build 1

New: TTL - optionally auto-unload unused API models after a certain amount of time
    Docs: TTL and Auto-Evict
    Set the ttl field in API request payloads
    For command-line use: lms load --ttl <seconds> (see the sketch after this list)
New: Auto-Evict - optionally auto-unload previously loaded API models before loading new ones (control in App Settings)
Fixed a bug where equations inside model thinking blocks would sometimes generate empty space below the block
Fixed cases where text in toast notifications was not scrollable
Fixed a bug where unchecking and checking Structured Output JSON would make the schema value disappear
Fixed a bug where auto-scroll while generating would sometimes not allow scrolling up
[Developer] Moved logging options to the Developer Logs panel header (••• menu)
Fixed the Chat Appearance font size option not scaling text in the Thoughts block
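
As a rough illustration of the TTL item above, here is a minimal Python sketch against the local OpenAI-compatible endpoint. It assumes the server is listening on LM Studio's default http://localhost:1234 and that "my-model" is a placeholder identifier; the ttl value is passed as an extra field in the request payload, per the TTL and Auto-Evict docs.

    # Hypothetical sketch: ask the server to auto-unload the model after
    # 300 seconds of API inactivity by adding a "ttl" field to the payload.
    from openai import OpenAI

    client = OpenAI(base_url="http://localhost:1234/v1", api_key="lm-studio")

    response = client.chat.completions.create(
        model="my-model",                                 # placeholder model id
        messages=[{"role": "user", "content": "Hello"}],
        extra_body={"ttl": 300},                          # seconds until auto-unload
    )
    print(response.choices[0].message.content)

The equivalent from the command line would be along the lines of: lms load my-model --ttl 300.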

envolution commented on 2025-01-18 18:22 (UTC)

0.3.7 updates:
    New: Hardware tab in Mission Control. Open with Cmd/Ctrl + Shift + H.
    New: Added a server file logging mode option that gives you finer control over what gets logged in the log files.
    New: KV Cache quantization for llama.cpp models (requires llama.cpp/1.9.0+ runtime)
    Added support for nulls in the OpenAI-compatible API server (see the sketch after this list).
    Fixed prediction queueing not working (queued predictions would return empty results).
    Show runtime update notifications only for currently used runtimes
    Added a descriptive error when LM Studio fails to start due to lack of file system access.
    Fixed a bug where JIT model loading could sometimes cause an error
    Fixed a bug where an engine extension's output had an extraneous newline in the logs
    Fixed a bug where two chats would sometimes be created for new users.
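
To illustrate the null handling mentioned above, here is a small Python sketch. The endpoint and port are LM Studio's defaults, "my-model" is a placeholder, and the choice of which optional fields to send as null is an assumption made for illustration, not a statement of exactly what the fix covers.

    # Hypothetical sketch: send explicit JSON nulls for optional parameters
    # to the local OpenAI-compatible chat completions endpoint.
    import json
    import urllib.request

    payload = {
        "model": "my-model",                                 # placeholder model id
        "messages": [{"role": "user", "content": "Hello"}],
        "stop": None,                                        # serialized as JSON null
        "max_tokens": None,                                  # serialized as JSON null
    }
    req = urllib.request.Request(
        "http://localhost:1234/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])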

envolution commented on 2025-01-18 18:05 (UTC)

reintroduced as per https://lmstudio.ai/beta-releases