Build A Privacy-First AI Search Engine For Free With TurboSeek

A powerful & open-source AI search engine that combines LLMs, web scraping, and the Bing API for secure and informative searches.

TurboSeek is an open-source AI search engine that provides a privacy-focused alternative to Perplexity and Google Search.

Inspired by Perplexity AI, TurboSeek utilizes Mixtral 8x7B and Meta’s Llama-3 for its large language models (LLMs) and Bing for its search API.


This combination allows TurboSeek to offer a unique blend of AI-powered search functionality while prioritizing user data privacy.

How it works:

When a user submits a query, TurboSeek fetches the top six results from Bing’s search API. It then scrapes text from those links to create context.

Next, it sends the query and the scraped context to Mixtral 8x7B, which generates the answer returned to the user.
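To make that flow concrete, here is a minimal TypeScript sketch of the pipeline, assuming Bing's Web Search v7 endpoint and Together AI's OpenAI-compatible chat completions endpoint. The helper names (searchBing, scrape, getAnswer) and the naive tag-stripping scraper are illustrative placeholders, not TurboSeek's actual implementation.

  // Minimal sketch of the search -> scrape -> answer pipeline (illustrative only).
  const BING_ENDPOINT = "https://api.bing.microsoft.com/v7.0/search";
  const TOGETHER_ENDPOINT = "https://api.together.xyz/v1/chat/completions";

  async function searchBing(query: string): Promise<string[]> {
    // Ask Bing for the top six web results for the query.
    const res = await fetch(`${BING_ENDPOINT}?q=${encodeURIComponent(query)}&count=6`, {
      headers: { "Ocp-Apim-Subscription-Key": process.env.BING_API_KEY! },
    });
    const data = await res.json();
    return (data.webPages?.value ?? []).map((r: { url: string }) => r.url);
  }

  async function scrape(url: string): Promise<string> {
    // Crude text extraction: fetch the page and strip scripts and HTML tags.
    const html = await (await fetch(url)).text();
    return html
      .replace(/<script[\s\S]*?<\/script>/gi, " ")
      .replace(/<[^>]+>/g, " ")
      .slice(0, 4000); // keep each page's contribution to the context small
  }

  async function getAnswer(query: string, context: string): Promise<string> {
    // Send the query plus scraped context to Mixtral 8x7B via Together AI.
    const res = await fetch(TOGETHER_ENDPOINT, {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.TOGETHER_API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
        messages: [
          { role: "system", content: "Answer the question using only the provided context." },
          { role: "user", content: `Context:\n${context}\n\nQuestion: ${query}` },
        ],
      }),
    });
    const data = await res.json();
    return data.choices[0].message.content;
  }

  // Usage: tie the three steps together for a single query.
  async function turboSeek(query: string): Promise<string> {
    const urls = await searchBing(query);
    const pages = await Promise.all(urls.map(scrape));
    return getAnswer(query, pages.join("\n\n"));
  }

A production version would use a proper HTML-to-text extractor and stream the model's response, but the request/response flow follows this general shape.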

Additionally, Llama-3 generates three related questions that encourage the user to explore further and gain a deeper understanding of the topic.
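The follow-up questions can be produced with a second, lighter request to the same chat endpoint. The sketch below assumes one of Together AI's hosted Llama-3 chat models; the exact model ID and the prompt wording are assumptions for illustration rather than what the repo pins.

  // Sketch: ask Llama-3 (via Together AI) for three follow-up questions.
  async function getRelatedQuestions(query: string): Promise<string[]> {
    const res = await fetch("https://api.together.xyz/v1/chat/completions", {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.TOGETHER_API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        // Assumed model ID; the project may pin a different Llama-3 variant.
        model: "meta-llama/Llama-3-8b-chat-hf",
        messages: [
          {
            role: "system",
            content: "Given a search query, return exactly three short follow-up questions, one per line.",
          },
          { role: "user", content: query },
        ],
      }),
    });
    const data = await res.json();
    // Split the model's reply into individual questions and keep the first three.
    return data.choices[0].message.content
      .split("\n")
      .map((q: string) => q.trim())
      .filter(Boolean)
      .slice(0, 3);
  }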

How to use it:

1. Fork or clone the repo from GitHub:

 git clone https://github.com/Nutlope/turboseek.git

2. Set Up Inference and API Accounts:

  • Create an account at Together AI for large language model inference.
  • Sign up with Azure to obtain a Bing search API key.
  • Register at Helicone for observability services.

3. Rename .example.env to .env and fill in your API keys:

TOGETHER_API_KEY=
BING_API_KEY=
HELICONE_API_KEY=

4. Install Dependencies and Run Locally:

  • Execute npm install to install all required dependencies.
  • Start the search engine on your local machine by running npm run dev.

5. Enter your queries in the TurboSeek web interface. TurboSeek will use Mixtral 8x7B and Llama-3, along with Bing’s search API, to answer your questions. The sources used for the answer are displayed above the response, and related questions appear below it to encourage further exploration.

[Screenshot: TurboSeek AI search engine example]

FAQs:

Q: What makes TurboSeek different from traditional search engines like Google?
A: TurboSeek prioritizes privacy by letting users deploy it on their own infrastructure, so their search data stays within an environment they control. It also uses LLMs to provide contextual and informative responses.

Q: Can TurboSeek be used for commercial purposes?
A: Yes, TurboSeek is an open-source project, and users can deploy and utilize it for commercial purposes as long as they comply with the project’s license terms.

Q: What technology powers TurboSeek?
A: TurboSeek is built with Next.js and Tailwind, uses Together AI for LLM inference with the Mixtral 8x7B and Llama-3 models, and relies on Bing for its search API.

Q: What are LLMs?
A: Large language models (LLMs) are AI systems trained on massive datasets of text and code. They can generate text, translate languages, write different kinds of creative content, and answer questions in an informative way.
