The essential skills I developed to build these projects were acquired during my course, CSC444: Deep Learning.
"VansAI" is a GUI application built in CSC444 - Deep Learning with Python and tkinter to interact with a local large language model (LLM). It features various AI personalities and functions. On startup, the application loads a specified model with the llama_cpp library and offers a choice of personalities. Users enter prompts in a text area, and the model generates responses from that input. The interface includes an image display, text areas for the conversation history and user prompts, and a button to send prompts for processing. The application also supports text-to-speech output via pyttsx3, adding spoken responses to the interaction. The GUI's design emphasizes user-friendliness and customization, making it suitable for engaging with different AI conversational styles and functionalities.
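The core loop described above can be sketched as follows. This is a minimal illustration, not the project's actual code: it assumes the llama-cpp-python API, and `MODEL_PATH`, the personality strings, and the prompt layout are all placeholders.

```python
# Sketch of VansAI's personality + prompt flow. MODEL_PATH and the
# PERSONALITIES dict are illustrative placeholders, not project values.
PERSONALITIES = {
    "friendly": "You are a cheerful, helpful assistant.",
    "formal": "You are a precise, professional assistant.",
}

def build_prompt(personality: str, user_text: str) -> str:
    """Combine a personality's system message with the user's prompt."""
    system = PERSONALITIES.get(personality, PERSONALITIES["friendly"])
    return f"SYSTEM: {system}\nUSER: {user_text}\nASSISTANT:"

def ask(llm, personality: str, user_text: str) -> str:
    """Generate a response from a loaded llama-cpp-python model."""
    out = llm(build_prompt(personality, user_text), max_tokens=256)
    return out["choices"][0]["text"].strip()

# Usage (requires llama-cpp-python and a downloaded GGUF model):
#   from llama_cpp import Llama
#   llm = Llama(model_path="dolphin-2.6-mistral-7b-dpo.Q5_K_M.gguf", n_ctx=2048)
#   print(ask(llm, "friendly", "Hello!"))
```

The model call itself is left to a usage comment so the prompt-building logic stays runnable without a multi-gigabyte download.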
This week, you will learn to work with Local GUI for your local LLM AI assistant.
Objectives:
Practice by creating a new Local GUI for your local LLM AI assistant.
Try out several local quantized chat models from TheBloke.
Evaluate several LLMs locally to find the best ones for your local AI assistant.
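For the evaluation objective, a simple side-by-side harness helps compare models on the same prompts. This is a hedged sketch: the `generate` callable is a stand-in for however you invoke each model (for example, a llama-cpp-python `Llama` instance), and the test prompts are arbitrary examples.

```python
# Simple evaluation harness: time each prompt and record response length.
# generate() is any callable taking a prompt string and returning a reply.
import time

TEST_PROMPTS = [
    "Explain recursion in one sentence.",
    "Write a haiku about autumn.",
]

def evaluate(name, generate, prompts=TEST_PROMPTS):
    """Run one model over the prompts, collecting latency and reply size."""
    results = []
    for prompt in prompts:
        start = time.perf_counter()
        reply = generate(prompt)
        elapsed = time.perf_counter() - start
        results.append({"prompt": prompt, "seconds": elapsed, "chars": len(reply)})
    return {"model": name, "results": results}

# With a real model you might pass, e.g.:
#   evaluate("dolphin-q5", lambda p: llm(p, max_tokens=128)["choices"][0]["text"])
```

Latency and length are crude metrics; reading the replies yourself is still the deciding factor, but the numbers make it easy to spot a model that is too slow for your machine.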
Resources
https://huggingface.co/TheBloke/dolphin-2.6-mistral-7B-dpo-GGUF/blob/main/dolphin-2.6-mistral-7b-dpo.Q5_K_M.gguf
Run the following pip install at your command line to install the library that lets our app talk to GGUF models.
pip install llama-cpp-python
Once you have everything working, try out different models. Download several different GGUF models, only from:
https://huggingface.co/TheBloke
Requirements / Expectations:
Create a new folder.
Create a Local GUI for one or more of your favorite local LLMs as your AI assistant using Python.
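A rough tkinter skeleton for this assignment might look like the following. The `respond()` stub stands in for a real model call (for example, via llama-cpp-python); the widget names and layout are illustrative, not a required structure.

```python
# Minimal tkinter skeleton for a local-LLM assistant GUI.
# respond() is a placeholder -- swap in your actual model call.
def respond(prompt: str) -> str:
    """Placeholder for the LLM call; replace with your model."""
    return f"(echo) {prompt}"

def build_app():
    # Imported here so respond() stays usable without a display.
    import tkinter as tk
    from tkinter import scrolledtext

    root = tk.Tk()
    root.title("Local LLM Assistant")

    history = scrolledtext.ScrolledText(root, height=15, state="disabled")
    history.pack(fill="both", expand=True)

    entry = tk.Entry(root)
    entry.pack(fill="x")

    def send():
        prompt = entry.get().strip()
        if not prompt:
            return
        reply = respond(prompt)
        history.configure(state="normal")
        history.insert("end", f"You: {prompt}\nAI: {reply}\n\n")
        history.configure(state="disabled")
        entry.delete(0, "end")

    tk.Button(root, text="Send", command=send).pack()
    return root

# To run the GUI:
#   build_app().mainloop()
```

Keeping the history widget `disabled` except while inserting prevents users from editing past conversation turns, which matches the read-only history area described in the project writeups.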
April 9th, 2024
"VansAI" is a GUI application from CSC444 - Deep Learning with Python and tkinter, designed to interact with a local large language model (LLM) featuring various AI personalities. It uses the llama_cpp library for model loading, allowing users to enter prompts and receive responses. The interface includes text areas for prompts and conversation history, an image display, a send button, and text-to-speech support via pyttsx3. The design prioritizes user-friendliness and customization.
In this exciting assignment, you are going to create avatars for your LLMs.
Guidelines and Resources:
https://huggingface.co/TheBloke/dolphin-2.6-mistral-7B-dpo-GGUF/blob/main/dolphin-2.6-mistral-7b-dpo.Q5_K_M.gguf
To create avatars:
https://www.bing.com/images/create
https://creator.nightcafe.studio/
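Avatars generated on these sites come in arbitrary sizes, so they usually need scaling before display in the GUI. Below is a small helper for computing a display size; the 128-pixel box is an assumption, not a value from the project, and the Pillow resize shown in the comment is one way to apply it.

```python
# Compute a thumbnail size that fits a square display box while
# preserving the avatar's aspect ratio. The 128 px box is an assumption.
def fit_avatar(width: int, height: int, box: int = 128) -> tuple:
    """Return (w, h) scaled so the longer side equals box."""
    scale = box / max(width, height)
    return (round(width * scale), round(height * scale))

# With Pillow, the actual resize would be roughly:
#   from PIL import Image
#   img = Image.open("avatar.png")
#   img = img.resize(fit_avatar(*img.size), Image.LANCZOS)
```

Preserving aspect ratio avoids the squashed look you get from resizing a portrait-shaped avatar straight to a square.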
Expectations / Requirements:
April 19th, 2024
This application was built in CSC444 - Deep Learning. It offers a local GUI for interacting with a large language model (LLM). Users can generate model responses, save conversations, and load past sessions. The tool also features a pixel art drawing pad with color selection, undo functionality, and the ability to save artwork. Built with Python's tkinter and Pillow libraries, it provides an engaging way to interact with AI models and create pixel art.
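The drawing pad's undo feature reduces to a simple state pattern: a grid of colors plus a stack of previous values. The sketch below shows that core; the tkinter Canvas wiring and Pillow saving are omitted, and all names here are illustrative rather than the project's actual code.

```python
# Core state for a pixel-art pad with undo: a color grid and an
# undo stack of (x, y, previous color) entries.
class PixelPad:
    def __init__(self, width=16, height=16, background="#ffffff"):
        self.grid = [[background] * width for _ in range(height)]
        self.undo_stack = []  # each entry: (x, y, color before the paint)

    def paint(self, x, y, color):
        """Set one pixel, remembering its old color for undo."""
        self.undo_stack.append((x, y, self.grid[y][x]))
        self.grid[y][x] = color

    def undo(self):
        """Revert the most recent paint, if any."""
        if self.undo_stack:
            x, y, old = self.undo_stack.pop()
            self.grid[y][x] = old
```

Because each undo entry stores only one changed pixel, memory stays small even after long drawing sessions; saving the artwork would map `grid` onto a Pillow image pixel by pixel.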
You made it to your final project! Congratulations! Your final project will be an app built with the skills you learned in this class, or even something that interests you that we have not covered in this course.
Guidelines:
You will use some of your favorite new skills from this course to build a new app of your choice for your final. If you cannot think of an idea, please ping me on MS Teams. I would be happy to help!
Requirements and Expectations:
April 24th, 2024
Copyright © 2024 Artvise AI - All Rights Reserved.