Package Details: gpt4all-chat-git 3.3.0.r22.g767189d-1

Git Clone URL: https://aur.archlinux.org/gpt4all-chat-git.git (read-only)
Package Base: gpt4all-chat-git
Description: Cross-platform Qt-based GUI for GPT4All
Upstream URL: https://github.com/nomic-ai/gpt4all
Keywords: chatgpt gptj.cpp gui llama.cpp offline
Licenses: MIT
Conflicts: gpt4all-chat
Provides: gpt4all-chat
Submitter: jgmdev
Maintainer: jgmdev
Last Packager: jgmdev
Votes: 14
Popularity: 0.084763
First Submitted: 2023-05-12 17:04 (UTC)
Last Updated: 2024-10-04 04:24 (UTC)

Latest Comments


jgmdev commented on 2023-06-04 19:14 (UTC)

Had to move the installation to /opt/gpt4all-chat and copy all the libs as pointed out by @Gilfoyle; it seems to be working now.

Gilfoyle commented on 2023-06-03 21:01 (UTC)

Tried a few things: make install builds 3 libs: "libllmodel.so", "libllmodel.so.0", and "libllmodel.so.0.2.0".

But projdir/build/bin/ contains more libs. When you add the lib "libgptj-default.so" to the 3 libs above and keep the "chat" binary in the same folder, it works. Unfortunately, it does not work when you copy all the libs into /lib/.

I do not know if this will help to fix this error, but I hope so.
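For illustration, a hedged sketch of the working layout the comment describes (using projdir/build/bin as the source of every file is an assumption; the comment only confirms that the chat binary, libgptj-default.so, and the three libllmodel libs must end up in the same folder):

mkdir -p /opt/gpt4all-chat
cp projdir/build/bin/chat projdir/build/bin/libgptj-default.so projdir/build/bin/libllmodel.so* /opt/gpt4all-chat/
cd /opt/gpt4all-chat && ./chat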

Elmario commented on 2023-06-03 19:20 (UTC)

Thank you, that's working! They shouldn't call the installer 'Ubuntu' ...

m78 commented on 2023-06-03 19:08 (UTC)

Follow this guide to get it to work: https://www.jeremymorgan.com/blog/linux/gpt-for-arch-linux/

The installer also ships the test_hw file that is missing here. Maybe that is why it does not work from the AUR.

burster commented on 2023-06-03 01:03 (UTC)

Can confirm the last two posts. The install failed as @Elmario described, and it isn't loading models (see @Gilfoyle). The upstream installer version works with the same config.

Elmario commented on 2023-06-02 19:57 (UTC)

How would I modify this to build AVX1-compatible binaries? (Btw: I had to comment out the following line in the PKGBUILD to make it build successfully, because test_hw does not exist: # mv "${pkgdir}/usr/bin/test_hw" "${pkgdir}/usr/bin/gpt4all-test_hw")
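For reference, one hedged way to attempt an AVX1-only build (this assumes the bundled llama.cpp backend still honors its LLAMA_AVX2/LLAMA_FMA/LLAMA_F16C CMake options; option names change between versions, so check the vendored CMakeLists before relying on this) would be to append the flags to the existing cmake call in the PKGBUILD's build() function:

cmake -B build -DLLAMA_AVX2=OFF -DLLAMA_FMA=OFF -DLLAMA_F16C=OFF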

Gilfoyle commented on 2023-06-02 17:38 (UTC)

I am experiencing an error similar to @Domroon's. The program indicates that the models are being loaded, but I have doubts because my CPU usage remains below 2%.

jgmdev commented on 2023-06-01 16:01 (UTC)

Added the other reported dependencies.

@cyqsimon, instead of setting the make flags directly in the PKGBUILD, you can:

cp /etc/makepkg.conf ~/.makepkg.conf

and modify the MAKEFLAGS line, e.g.:

MAKEFLAGS="-l 90 -j"

This uses all available cores while keeping the load average below 90.
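A minimal sketch of the whole change in one go (the sed pattern assumes the stock /etc/makepkg.conf ships MAKEFLAGS either commented out or set to a default like "-j2"; adjust it if your file differs):

cp /etc/makepkg.conf ~/.makepkg.conf
sed -i 's/^#\?MAKEFLAGS=.*/MAKEFLAGS="-l 90 -j"/' ~/.makepkg.conf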

Domroon commented on 2023-06-01 11:20 (UTC) (edited on 2023-06-01 11:31 (UTC) by Domroon)

You need to install the following packages:

sudo pacman -S qt6-shadertools qt6-5compat

Don't forget to put a model in the right folder. In my case the model loads forever; I don't know what's going on.
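For what it's worth, a hedged example of placing a downloaded model (both the directory and the file name below are assumptions, not confirmed by this thread; check the app's settings for the folder it actually scans):

mkdir -p ~/.local/share/nomic.ai/GPT4All
cp ~/Downloads/ggml-gpt4all-j-v1.3-groovy.bin ~/.local/share/nomic.ai/GPT4All/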

Gilfoyle commented on 2023-06-01 01:37 (UTC)

I am getting this error when I try to run gpt4all-chat. Can somebody please help me?


deserializing chats took: 1 ms
QQmlApplicationEngine failed to load component
qrc:/gpt4all/main.qml:6:1: module "Qt5Compat.GraphicalEffects" is not installed
serializing chats took: 0 ms
QCoreApplication::applicationDirPath: Please instantiate the QApplication object first

Thanks for maintaining.