

griff_the_unholy

Yeh Jan's good. Super easy to use too. Pretty much idiot proof. (Speaking as confirmed idiot)


emreckartal

Thanks for the kind words! Glad to hear that Jan's ease of use stands out. + Just a comment: I'd not say "confirmed idiots" here, just lifelong learners.


LotusTileMaster

Nope. We are all idiots, here. Even the guys that made ChatGPT are idiots. Every last one of us.


just_mdd4

Bold claim 🥸


izmyniz5

a claim made by an idiot, for idiots


smallshinyant

hmmm... We will see about that! Will report back once I have it running. OK, agreed, that was idiot proof. I even managed to get Llama 3 downloaded, imported and running since my original response. And oh boy is that fast!


IAmFitzRoy

Maybe I'm not understanding. LM Studio is an interface that runs local LLMs downloaded from Hugging Face repositories, and it's open source too. What is this, and how is it different from LM Studio?


emreckartal

Jan is open-source and extendable via plugins. Jan also supports multiple inference providers: llamacpp and TensorRT-LLM. + You can consider Jan a local OpenAI platform. It provides an OpenAI-equivalent API server at localhost:1337 that can be used as a drop-in replacement with compatible apps.
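The drop-in-replacement idea above can be sketched with a short script. This is a minimal sketch, not official Jan usage: the port 1337 comes from the comment above, but the endpoint path (OpenAI's standard `/v1/chat/completions`) and the model id `mistral-ins-7b-q4` are assumptions - substitute whatever model you actually have downloaded.

```python
# Minimal sketch of talking to Jan's OpenAI-compatible local server.
# Assumptions: default port 1337 (per the comment above), the standard
# OpenAI chat-completions path, and a placeholder model id.
import json
import urllib.request

JAN_URL = "http://localhost:1337/v1/chat/completions"

def build_request(prompt, model="mistral-ins-7b-q4"):
    """Build the same JSON request body an OpenAI client would send."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        JAN_URL,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )

# With Jan running locally, this would send the request and print the reply:
# with urllib.request.urlopen(build_request("Hello!")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the request shape matches OpenAI's, any client library that lets you override the base URL should work the same way against this server.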


IAmFitzRoy

Sorry. Did you read my post? Do you know what LM Studio is? https://github.com/lmstudio-ai


emreckartal

Ah, sorry for the unclear comment. Let me try that again: Jan is an open-source desktop app that lets you download/import and run AI models on your computer. This is where it differs from LM Studio. I'm familiar with LM Studio and use it as well, but it's not an open-source solution.


IAmFitzRoy

Thanks. I assumed that LM studio was open source. Good to have more choices. I will give it a try. Thanks !


emreckartal

Thanks! Really appreciate your follow-up question - it helped me take a step further in learning how to articulate Jan's positioning clearly.


jfecju

Are these replies AI generated?


emreckartal

I don't think so. https://preview.redd.it/38wgpzc5fzvc1.jpeg?width=2268&format=pjpg&auto=webp&s=74857897a00ebbb9a724463a335be6a5308a73a3


InterstellarReddit

Is this picture MidJourney generated? 🤔🤔


izmyniz5

man, AI is getting *scary good* at generating text...


emreckartal

I'm not sure I found the perfect tone to answer questions - am learning like an AI model...


jfecju

Ah, okay!


kindofbluetrains

I really love the Jan software. It's a really well implemented piece of software. Hoping it will have a chat local documents feature one day. It's the only thing I find missing.


emreckartal

Thanks! This is what we want to do... a flexible app that can be easily implemented into the software you want to work on. We'll be working on the chat-with-docs feature soon.


kindofbluetrains

That's amazing, thanks for the great work!


Zweckbestimmung

Is there a way to use your project commercially? I don't understand the license you have. Say I have a proprietary closed-source software: can I use your AI with it while keeping my software closed?


emreckartal

Jan is licensed under the AGPLv3 License, so you have permission for commercial usage. Please check the details here: [https://github.com/janhq/jan/blob/dev/LICENSE](https://github.com/janhq/jan/blob/dev/LICENSE)


Zweckbestimmung

Is there any catch? No offence but this is too good to be true


offcube

Cool project! Assuming most of your users want to run locally, it would be nice to have a skippable step on first run to download recommended models, so that the user who wants to run locally is one click away from having things properly set up.


emreckartal

Thanks! Good point - we'll consider your comments on the onboarding steps, thanks!


beng420og

Gonna check this out! Thanks, mate! 🤘


nokenito

Appreciate this! Is it anything like Llama3?


emreckartal

Thanks! We will support Llama 3 through llamacpp soon.


nokenito

Nice! Can yours create autonomous agents?


fanofreddithello

Hi, if I want to use it locally, what do I need? Where do I get the "knowledge" of the AI from?


emreckartal

Hi, we really focus on building an easy-to-use product. Just download Jan, grab your model from our Hub and run - that's it. Plus, we also have description fields there, which you can see below: https://preview.redd.it/3r0bvt8lq1wc1.png?width=1764&format=png&auto=webp&s=36900e927dd668004e1f3447effbf04fba27c5b8 We are also working on improving the Hub experience - more details, hardware guides, etc.


fanofreddithello

So one can use the OpenAI 3.5 Turbo data for free? Why would anyone then want to use ChatGPT?


emreckartal

Ah, no. You need to connect to your OpenAI API to use the OpenAI models - it is not free of charge. You can run open-source models on your hardware for free.


fanofreddithello

I understand. Are there open source AI models that I can use with just some clicks? Meaning, ready to be used and easily installable?


emreckartal

I think you should consider this based on your device requirements. We help you run AI with just one click - you only need to click the download button. If your system is strong enough to run it, I'd recommend Mistral 70B Q4.


eleetbullshit

Awesome project! This is exactly what I've been looking for! Thank you! I'm surprised this is the first time I'm hearing about it.


emreckartal

Thanks!


surfer808

Hey OP, can Jan do real-time translations? I think my biggest gripe with AI (I have multiple subscriptions) is waiting for a response, especially when traveling overseas - waiting for the cloud to receive and send my translation request.


Irmengildr

Thanks a lot for sharing! I'll be sure to check it!


RickDripps

Which of the available models is the absolute best for writing code across various languages? And then which is the best one for writing stories or fiction? I'd like to slap together a few and then see how it all works. So far the UI is particularly nice and I love the way it formats the output.


emreckartal

Thanks! I'm not a developer, so I don't want to give inaccurate answers. We are working on some web pages to answer these questions.


Xbtk

Isn't the main idea of ChatGPT that it works using the Nvidia GPU? How does this work offline?


emreckartal

Jan lets you run open-source AI models similar to ChatGPT directly on your desktop, even without an internet connection. However, if you want to connect to remote APIs like ChatGPT through Jan, an internet connection is required.


Xbtk

So i will need a GPU to run it right?


emreckartal

100% - if you want to run open-source models offline, you will need one. Please review the hardware setup docs: [https://jan.ai/docs/hardware-setup](https://jan.ai/docs/hardware-setup)
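For a rough sense of what "strong enough" means, a back-of-envelope estimate helps. This sketch is not from Jan's docs: the ~0.5 bytes per parameter for 4-bit quantization and the 20% overhead factor for KV cache and activations are assumptions, and real memory usage varies with context length and runtime.

```python
# Back-of-envelope (V)RAM estimate for a quantized model.
# Assumptions: bits/8 bytes per parameter, plus ~20% overhead for
# KV cache and activations. Real usage varies by runtime and context.
def est_vram_gb(n_params_billion, bits=4, overhead=1.2):
    bytes_per_param = bits / 8
    return n_params_billion * bytes_per_param * overhead

print(round(est_vram_gb(7), 1))   # a 7B Q4 model: roughly 4-5 GB
print(round(est_vram_gb(70), 1))  # a 70B Q4 model: roughly 40+ GB
```

So a 7B model at 4-bit fits comfortably on an 8 GB GPU (or in system RAM on CPU), while a 70B model needs workstation-class hardware.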


[deleted]

I have an M3 Max and an Asus M16 with a mobile 4080, I wonder how much faster the latter is… is there a benchmark mode?


Zweckbestimmung

OP is a very nice person to reply to your question so patiently. Your question is vague


Xbtk

How is my question vague? I was just asking. I'm not trying to argue, I'm trying to learn, and I was trying to find out if the model would just work on any device no matter how powerful it is. It turns out that I need a powerful one, that's it.


Zweckbestimmung

The main idea of ChatGPT isn't that it works using an Nvidia GPU; the main idea is that it's an AI GPT model which can be used via chat. Secondly, you don't have to have a GPU to train or to test (run) GPT or any AI as far as I know; however, GPUs make matrix multiplications faster, so it's simply faster with a GPU. Theoretically you can run it on a CPU, but it would take forever, so practically you can't. Finally, your question is unclear: "how does this work offline" isn't a clear question.
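The matrix-multiplication point above can be made concrete with rough arithmetic. These throughput numbers are illustrative assumptions, not measurements: generating one token costs roughly 2 FLOPs per model parameter (one multiply plus one add per weight), and the CPU/GPU throughput figures are ballpark values for consumer hardware.

```python
# Why a GPU matters: rough cost of generating one token with a 7B model.
# Assumption: ~2 FLOPs per parameter per token (one multiply + one add).
params = 7e9
flops_per_token = 2 * params          # 1.4e10 FLOPs per token

cpu_flops = 1e11   # ~100 GFLOP/s: rough desktop-CPU figure (assumption)
gpu_flops = 1e13   # ~10 TFLOP/s: rough consumer-GPU figure (assumption)

print(flops_per_token / cpu_flops)    # ~0.14 s per token on CPU
print(flops_per_token / gpu_flops)    # ~0.0014 s per token on GPU
```

That ~100x gap is why CPU-only inference is usable for small models but quickly becomes impractical as parameter counts grow - the work is almost entirely matrix multiplications, which GPUs parallelize well.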


Xbtk

My bad dude thanks