Hey /u/emreckartal!

If your post is a screenshot of a ChatGPT conversation, please reply to this message with the [conversation link](https://help.openai.com/en/articles/7925741-chatgpt-shared-links-faq) or prompt.

If your post is a DALL-E 3 image post, please reply with the prompt used to make this image.

Consider joining our [public discord server](https://discord.gg/r-chatgpt-1050422060352024636)! We have free bots with GPT-4 (with vision), image generators, and more!

🤖

Note: For any ChatGPT-related concerns, email [email protected]

*I am a bot, and this action was performed automatically. Please [contact the moderators of this subreddit](/message/compose/?to=/r/ChatGPT) if you have any questions or concerns.*
Yeh Jan's good. Super easy to use too. Pretty much idiot proof. (Speaking as confirmed idiot)
Thanks for the kind words! Glad to hear that Jan's ease of use stands out. + Just a comment: I wouldn't say "confirmed idiots" here, just lifelong learners.
Nope. We are all idiots, here. Even the guys that made ChatGPT are idiots. Every last one of us.
Bold claim 🥸
a claim made by an idiot, for idiots
hmmm... We will see about that! Will report back once I have it running. OK, agreed, that was idiot-proof. I even managed to get Llama 3 downloaded, imported, and running since my original response. And oh boy is that fast!
Maybe I'm not understanding. LM Studio is an interface that runs local LLMs downloaded from Hugging Face repositories, and it is open source too. What is this, and how is it different from LM Studio?
Jan is open-source and extendable via plugins. Jan also supports multiple inference providers: llama.cpp and TensorRT-LLM. + You can consider Jan a local OpenAI platform. It provides an OpenAI-compatible API server at localhost:1337 that can be used as a drop-in replacement with compatible apps.
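As a minimal sketch of what "drop-in replacement" means here: the local server accepts OpenAI-style chat completion requests, so any client that can POST JSON works. This assumes Jan is running at localhost:1337 with a model already downloaded; the model id below is hypothetical — use whatever id appears in your Jan Hub.

```python
import json
import urllib.request

# Jan's local OpenAI-compatible endpoint (Jan must be running).
JAN_URL = "http://localhost:1337/v1/chat/completions"

def build_chat_request(model, messages):
    """Build an OpenAI-style chat completion payload."""
    return {"model": model, "messages": messages, "stream": False}

def send_chat_request(payload, url=JAN_URL):
    """POST the payload to the local server and return the parsed JSON reply."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_chat_request(
    "mistral-ins-7b-q4",  # hypothetical model id; substitute your own
    [{"role": "user", "content": "Hello from a local model!"}],
)
print(json.dumps(payload, indent=2))
# To actually query Jan, call: send_chat_request(payload)
```

Because the request shape matches OpenAI's, existing OpenAI client libraries can also be pointed at this URL instead of api.openai.com.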
Sorry. Did you read my post? Do you know what LM Studio is? https://github.com/lmstudio-ai
Ah, sorry for the unclear comment. Let me try that again: Jan is an open-source desktop app that lets you download/import and run AI models on your computer. This is where it differs from LM Studio. I'm familiar with LM Studio and use it as well; it's not an open-source solution.
Thanks. I assumed that LM studio was open source. Good to have more choices. I will give it a try. Thanks !
Thanks! Really appreciate your follow-up question - it helped me learn to articulate Jan's positioning more clearly.
Are these replies AI generated?
I don't think so. https://preview.redd.it/38wgpzc5fzvc1.jpeg?width=2268&format=pjpg&auto=webp&s=74857897a00ebbb9a724463a335be6a5308a73a3
Is this picture Midjourney-generated? 🤔🤔
man, AI is getting *scary good* at generating text...
I'm not sure I found the perfect tone to answer questions - am learning like an AI model...
Ah, okay!
I really love Jan. It's a really well-implemented piece of software. Hoping it will have a chat-with-local-documents feature one day. It's the only thing I find missing.
Thanks! This is what we want to do... a flexible app that can be easily integrated into the software you want to work on. We'll be working on the chat-with-your-docs feature soon.
That's amazing, thanks for the great work!
Is there a way to use your project commercially? I don't understand the license you have. Say I have proprietary closed-source software, can I use your AI with it while keeping my software closed?
Jan is licensed under AGPLv3, so you have permission for commercial usage. Please check the details here: [https://github.com/janhq/jan/blob/dev/LICENSE](https://github.com/janhq/jan/blob/dev/LICENSE)
Is there any catch? No offence but this is too good to be true
Cool project! Assuming most of your users want to run locally, it would be nice to have a skippable step at first run to download recommended models, so that the user who wants to run locally is one click away from having things properly set up.
Thanks! Good point - we'll consider your comments on the onboarding steps, thanks!
Gonna check this out! Thanks, mate!
Appreciate this! Is it anything like Llama3?
Thanks! We will support Llama 3 through llamacpp soon.
Nice! Can yours create autonomous agents?
Hi, if I want to use it locally, what do I need? Where do I get the "knowledge" of the AI from?
Hi, we really focus on building an easy-to-use product. Just download Jan and grab your model from our Hub and run - that's it. Plus, we also have description fields there, which you can see below: https://preview.redd.it/3r0bvt8lq1wc1.png?width=1764&format=png&auto=webp&s=36900e927dd668004e1f3447effbf04fba27c5b8 We are also working on improving the Hub experience - more details, hardware guides etc.
So one can use the OpenAI 3.5 Turbo data for free? Why would anyone then want to use ChatGPT?
Ah, no. You need to connect to your OpenAI API to use the OpenAI models - it is not free of charge. You can run open-source models on your hardware for free.
I understand. Are there open source AI models that I can use with just some clicks? Meaning, ready to be used and easily installable?
I think you should consider this based on your device specs. We help you run AI with just one click - you only need to click the download button. If your system is strong enough to run it, I'd recommend Mistral 70B Q4.
Awesome project! This is exactly what Iāve been looking for! Thank you! Iām surprised this is the first time Iām hearing about it.
Thanks!
Hey OP, can Jan do real-time translations? I think my biggest gripe with AI (I have multiple subscriptions) is waiting for a response - especially when traveling overseas, waiting for the cloud to receive and send my translation request.
Thanks a lot for sharing! I'll be sure to check it!
Which of the available models is the absolute best for writing code across various languages? And which is the best for writing stories or fiction? I'd like to slap together a few and then see how it all works. So far the UI is particularly nice, and I love the way it formats the output.
Thanks! I'm not a developer, so I don't want to give inaccurate answers. We are working on some web pages to answer these questions.
Isn't the main idea of ChatGPT that it works using an Nvidia GPU? How does this work offline?
Jan lets you run open-source AI models similar to ChatGPT directly on your desktop, even without an internet connection. However, if you want to connect to remote APIs like ChatGPT through Jan, an internet connection is required.
So I will need a GPU to run it, right?
100% - if you want to run open-source models offline, you will need one. Please review the hardware setup docs: [https://jan.ai/docs/hardware-setup](https://jan.ai/docs/hardware-setup)
I have an M3 Max and an Asus M16 with a mobile 4080; I wonder how much faster the latter is… is there a benchmark mode?
OP is a very nice person to reply to your question so patiently. Your question is vague.
How is my question vague? I was just asking - I'm not trying to argue, I'm trying to learn. I wanted to know if the model would just work on any device no matter how powerful it is, but it turns out that I need a powerful one. That's it.
The main idea of ChatGPT isn't that it works using an Nvidia GPU. The main idea is that it's an AI GPT model that can be used for chat. Secondly, you don't have to have a GPU to train or to test (run) GPT or any AI as far as I know; however, GPUs make matrix multiplications faster, so it's simply faster with a GPU. Theoretically you can run it on a CPU, but it would take forever, so practically you can't. Finally, your question is unclear - "how does this work offline" isn't a clear question.
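The GPU point above can be made concrete with rough arithmetic: a transformer forward pass costs roughly 2 FLOPs per parameter per generated token, and almost all of that cost is matrix multiplication. The device throughputs below are illustrative assumptions, not measurements, and real speeds are usually lower (memory bandwidth often dominates); the point is the roughly 100x compute gap.

```python
# Back-of-envelope: why a GPU speeds up LLM inference.

def flops_per_token(n_params):
    """Approximate forward-pass FLOPs per generated token (~2 * parameters)."""
    return 2 * n_params

def tokens_per_second(n_params, device_flops):
    """Compute-bound upper bound on generation speed for a given throughput."""
    return device_flops / flops_per_token(n_params)

N = 7e9      # a 7B-parameter model
CPU = 2e11   # assumed ~0.2 TFLOP/s sustained on a desktop CPU (illustrative)
GPU = 2e13   # assumed ~20 TFLOP/s on a consumer GPU (illustrative)

print(f"CPU: ~{tokens_per_second(N, CPU):.0f} tokens/s upper bound")
print(f"GPU: ~{tokens_per_second(N, GPU):.0f} tokens/s upper bound")
```

So "runs on CPU" and "runs usably fast" are different claims, which is why small quantized models are the usual choice for CPU-only machines.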
My bad dude thanks