
Built a local Mac AI assistant – would you actually use something like this?

Hello everyone,

I'm excited to share my latest project with you: SuriAI, a Mac menu bar AI assistant. It's designed to work fully offline, running local large language models (LLMs) through backends such as MLX, Core ML, and Ollama. Here are some of the key features it offers:

  • Interactive Chat: Chat with local LLMs, with markdown rendering, code snippets, and streamed responses (see the first sketch below).
  • System Management: Control your system from the assistant, from opening applications to searching files (see the second sketch below).
  • Voice and Text Interface: An intuitive interface combining voice and text interaction is on the way.
  • Python Integration: Extend functionality with LangChain-based Python agents tailored to your own workflows.

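To make the chat feature a bit more concrete, here's a rough Python sketch of the kind of streaming call the assistant makes against a locally running Ollama server. The function name, default model, and prompt are illustrative placeholders, not SuriAI's actual code:

```python
import json
import requests

def stream_chat(prompt: str, model: str = "llama3.2") -> str:
    """Stream a reply from a local Ollama server, printing tokens as they arrive."""
    reply = []
    with requests.post(
        "http://localhost:11434/api/chat",   # Ollama's default local endpoint
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": True,                  # tokens arrive as they are generated
        },
        stream=True,
        timeout=120,
    ) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():       # one JSON chunk per line
            if not line:
                continue
            chunk = json.loads(line)
            token = chunk.get("message", {}).get("content", "")
            print(token, end="", flush=True)  # render incrementally in the UI
            reply.append(token)
            if chunk.get("done"):
                break
    return "".join(reply)

if __name__ == "__main__":
    stream_chat("Summarize what a menu bar assistant could do on macOS.")
```
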
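And here's a similarly simplified sketch of what the system-management actions boil down to on macOS: shelling out to built-in tools like `open` (launch an app) and `mdfind` (Spotlight search from the command line). Again, the helper names and examples are hypothetical:

```python
import subprocess

def open_app(name: str) -> None:
    """Launch an application by name using macOS's built-in `open -a`."""
    subprocess.run(["open", "-a", name], check=True)

def search_files(query: str, limit: int = 10) -> list[str]:
    """Query the Spotlight index via `mdfind` and return up to `limit` matching paths."""
    out = subprocess.run(["mdfind", query], capture_output=True, text=True, check=True)
    return out.stdout.splitlines()[:limit]

if __name__ == "__main__":
    open_app("Notes")
    for path in search_files("quarterly report"):
        print(path)
```
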
Currently, SuriAI is in its minimum viable product (MVP) stage, and I’m eager to gather your candid feedback.

Would this type of tool pique your interest? Does it seem beneficial, overly ambitious, or perhaps unremarkable?

I want to ensure that I invest my time wisely and create a product that truly meets user needs.

If you’re interested, I’d be happy to share builds or discuss any potential improvements. I also welcome any constructive criticism of the website:

www.suriai.app

Thank you for your support! I’m looking forward to hearing your thoughts.
