Phi-3+ContinueDev+Ollama: STOP PAYING for Github's Copilot with this NEW, LOCAL & FREE Alternative

Published: 25 April 2024
on channel: AICodeKing

In this video, I'll show you how to create a free, self-hosted, local alternative to GitHub Copilot and Gemini Code Assist that runs on any computer, even without a GPU, and is extremely fast. It is based on Microsoft's new open-source LLM, the Phi-3-mini-3.8b model. The same setup works with any open-source LLM, such as Mixtral 8x22b or Mixtral 8x7b, and can stand in for proprietary assistants like GPT-4, Grok-1.5, and Gemini Code Assist.
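The terminal side of the setup boils down to installing Ollama and pulling the Phi-3 model. A minimal sketch (the installer URL and the `phi3` model tag are assumed from Ollama's published documentation, not shown verbatim in the video):

```shell
# Install Ollama (one-line installer for Linux/macOS from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Pull Microsoft's Phi-3 Mini (3.8b); it runs on CPU, no GPU required
ollama pull phi3

# Quick smoke test from the terminal before wiring up the editor
ollama run phi3 "Write a Python function that reverses a string"
```

After the pull completes, Ollama serves the model on `localhost:11434`, which is the endpoint editor extensions connect to.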


[Key Takeaways]
💻 Discover how to create a local AI coding assistant using Microsoft's Phi-3 model, a powerful alternative to GitHub's Copilot that's on par with Llama 3 8b and GPT-3.5.
🔧 Learn how to set up a fast and local Copilot alternative that can do everything Copilot can, without relying on third-party APIs or self-hosting large AI models.
💡 Get started with Ollama and the ContinueDev VSCode extension to create a seamless AI-powered coding experience that integrates with your existing workflow.
📊 Take advantage of the Phi-3 model's 128K context version to work with larger code bases and improve your overall coding productivity.
💻 Explore the capabilities of this local AI coding assistant, including code generation, auto-completion, and code explanation, all within your local environment.
👍 Find out why this Copilot alternative is accessible to most computers, making it a great option for developers who want a local AI coding solution without the need for powerful hardware.
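To connect the model to VSCode, the ContinueDev extension reads a JSON config file. A sketch of the relevant entries, assuming the default `~/.continue/config.json` location and the schema from Continue's documentation (adjust to your installed version):

```shell
# Point the Continue extension at the local Ollama server.
# File path and schema assumed from Continue's docs, not the video.
mkdir -p ~/.continue
cat > ~/.continue/config.json <<'EOF'
{
  "models": [
    {
      "title": "Phi-3 Mini (local)",
      "provider": "ollama",
      "model": "phi3"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Phi-3 Mini autocomplete",
    "provider": "ollama",
    "model": "phi3"
  }
}
EOF
```

With this in place, Continue's chat panel and tab autocompletion both go through the local Phi-3 model instead of a third-party API.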