“Hey Copilot: search the company database for documents related to this topic, and draft a comprehensive list of security questions to include in an RFP, so we can evaluate how a vendor’s security standards measure up to what we need.”
Imagine what drafting such a list of questions required before AI and Large Language Models (LLMs) came along. Now you could have a super-smart AI assistant living right on your company network to get you started on such a task.
The question, though, is: Should your network and the LLM live in the cloud, or be hosted on servers in your offices (ah, the days of local server rooms!)?
Guess what? Use cases like this are causing a stir in the tech world, and it’s worth looking at what they could mean for businesses and their trusty cloud services.[^1]
What’s the Big Deal?
You’ve probably heard of ChatGPT and other AI chatbots that seem to know everything. These clever systems are powered by something called “language models.” Think of them as the brains behind AI that can understand and generate human-like text. Until recently, these brainy AIs lived exclusively in the cloud, requiring an internet connection to work their magic.
But now, there’s a new kid on the block: open-source language models that can run right on your own computer or server. It’s like having your own personal AI genius living in your basement![^2]
Why Should Companies Be Excited?
1. Privacy is the New Black: With local AI, your data stays at home. No more worrying about sensitive info taking a trip through the internet![^3]
2. Speed Demons: Local AI can be lightning-fast since it doesn’t need to send data back and forth to the cloud.[^4]
3. Penny-Pinching Potential: For heavy users, running AI locally might be cheaper in the long run than paying for cloud services.[^5]
4. Customization Nation: Companies can teach their local AI their own secret sauce, making it extra smart about their specific business.[^6]
Where Could This Local AI Shine?
1. Company Intranets on Steroids: Imagine a super-smart search engine that knows all your company’s secrets (in a good way).
2. Workflow Wizardry: AI-powered tools that make your job easier without exposing sensitive data.
3. Learning and Sharing 2.0: Training programs and knowledge bases that are both brilliant and secure.
4. Customer Support Superheroes: Chatbots that can help customers quickly while keeping their info under wraps.[^7]
DIY AI: Tools of the Trade
For the tech-savvy companies ready to dive in, here are some popular options for deploying local language models:[^8]
1. Hugging Face: A treasure trove of AI models and tools (no actual hugging involved).
2. LangChain: A toolkit for building custom AI applications (chainmail not included).
3. Ollama: An easy-to-use option for those who don’t want to get lost in technical details.
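To give a feel for how simple “DIY AI” can be, here is a minimal sketch of talking to a locally running model through Ollama’s HTTP API, which by default listens on `localhost:11434`. This is an illustration, not a definitive implementation: the model name `llama3` is just an example of a model you might have pulled, and you would need an Ollama instance already running on the machine. Note that the prompt never leaves your own hardware.

```python
import json
import urllib.request

# Ollama's local HTTP API; by default it listens on port 11434.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete reply instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )


def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    """Send the prompt to the local model and return its text reply.

    Requires a running Ollama instance; no data leaves the machine.
    """
    with urllib.request.urlopen(build_request(prompt, model)) as resp:
        return json.loads(resp.read())["response"]
```

With that in place, the RFP example from the top of this post becomes a one-liner like `ask_local_llm("Draft a list of security questions for a vendor RFP.")` — run entirely in-house.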
What Could This Mean for Cloud Services?
While this local AI revolution is exciting, it doesn’t mean cloud services are going the way of the dinosaur. Instead, we’re likely to see a mix-and-match approach:[^9]
– Some companies might go all-in on local AI for super-sensitive stuff.
– Others might use a combo of local and cloud AI, depending on the task.
– Cloud services will probably adapt and find new ways to stay relevant.
The Bottom Line
The rise of local, open-source language models is like giving companies a new superpower. They can now harness the brilliance of AI while keeping their secrets safe and potentially saving some cash. As this technology grows, we’ll likely see more businesses bringing AI in-house for certain tasks.[^10]
But remember, this is just the beginning of the story. The world of AI is changing faster than you can say “artificial intelligence,” so who knows what exciting developments are just around the corner? One thing’s for sure – the future of AI is looking smarter, faster, and more personal than ever before!
[^1]: https://sideeffekt.com/2024/07/20/benefits-of-local-large-language-models/
[^2]: https://www.hostcomm.co.uk/blog/2024/open-source-vs-proprietary-llms-a-comprehensive-comparison
[^3]: https://blog.zhaw.ch/artificial-intelligence/2023/04/20/deploy-your-own-open-source-language-model/
[^4]: https://semaphoreci.com/blog/local-llm
[^5]: https://ubiops.com/openai-vs-open-source-llm/
[^6]: https://www.netguru.com/blog/small-language-models
[^7]: https://www.datacamp.com/blog/the-pros-and-cons-of-using-llm-in-the-cloud-versus-running-llm-locally
[^8]: https://blog.zhaw.ch/artificial-intelligence/2023/04/20/deploy-your-own-open-source-language-model/
[^9]: https://www.datacamp.com/blog/the-pros-and-cons-of-using-llm-in-the-cloud-versus-running-llm-locally
[^10]: https://sideeffekt.com/2024/07/20/benefits-of-local-large-language-models/