Project Overview
Problem Statement
At Colby College, faculty members across a range of disciplines increasingly want to integrate cutting-edge Large Language Models (LLMs) into their workflows. These models, both open-source and commercially licensed, offer significant potential for research, teaching, and administrative tasks. However, access to these LLMs currently runs through the Davis AI team, which has become a bottleneck for faculty and a growing burden for the team.
Challenges Faced by Davis AI:
- High Demand for LLM Access: A growing number of faculty members want to experiment with and deploy LLMs. Accessing these models, especially the enterprise versions of commercial offerings like ChatGPT, requires going through Davis AI, leading to delays and an overwhelming workload for the team.
- Complexity of Open-Source LLMs: Open-source LLMs, while powerful, require specialized knowledge and resources to set up. Faculty members often rely on Davis AI for high-performance computing resources, model selection, and environment configuration, adding further strain to the team.
These challenges highlight a clear need for a more streamlined and scalable solution that empowers faculty while reducing dependence on Davis AI for routine tasks.
Solution Overview
The Colby LLM Playground was conceived as a centralized platform to address these challenges by providing seamless access to both open-source and commercial LLMs. This solution not only alleviates the workload on Davis AI but also democratizes access to advanced AI tools for all faculty members.
Key Features:
- Centralized Access: The Playground offers a single, unified access point for all available LLMs. Faculty members use their unique API keys to access and experiment with any model without navigating the complexities of individual setups (a client-side call is sketched after this list).
- Automated User Management: The creation and management of unique user accounts are fully automated, allowing Davis AI to focus on maintaining enterprise licenses and ensuring smooth operation, rather than getting bogged down in administrative tasks.
- Open- and Closed-Source Models: The platform integrates both open-source LLMs hosted via OctoAI and commercially licensed models, providing users with a wide range of tools to fit their specific needs.
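
From a faculty member's perspective, using the Playground amounts to sending a request with their personal key. The sketch below illustrates this pattern in Python; the endpoint URL, model name, payload shape, and response format are assumptions for illustration, not the Playground's actual API.

```python
import requests

# Hypothetical endpoint, model name, and response shape; the Playground's
# real API may differ.
PLAYGROUND_URL = "https://playground.colby.edu/api/v1/chat"  # assumed URL
API_KEY = "your-unique-faculty-api-key"                      # issued per user

response = requests.post(
    PLAYGROUND_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "llama-3-8b-instruct",  # any available open- or closed-source model
        "messages": [
            {"role": "user", "content": "Summarize the attached lecture notes."}
        ],
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

The same key works for every model exposed through the Playground, which is what removes the need for per-model setup on the faculty side.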
Implementation
Design and Technical Overview
The Colby LLM Playground was built with scalability and ease of use in mind:
- Frontend: A single-page application developed using React.js, ensuring a smooth and responsive user experience.
- Backend: A robust backend powered by Flask and PostgreSQL, hosted on AWS for reliability and performance (a minimal request-handling sketch follows this list).
- Cloud Integration: OctoAI was chosen as the cloud provider for hosting open-source LLMs, offering the necessary computational power and flexibility to handle diverse model requirements.
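
To make the flow concrete, here is a minimal sketch of how a Flask endpoint might validate a faculty member's API key against PostgreSQL and forward the request to a hosted open-source model. Table and column names, environment variables, and the OctoAI endpoint shown are assumptions for illustration, not the production implementation.

```python
# Minimal sketch of the key-check-and-forward pattern described above.
# Table/column names, env variables, and the OctoAI endpoint are assumptions.
import os

import psycopg2
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)

OCTOAI_ENDPOINT = "https://text.octoai.run/v1/chat/completions"  # assumed
OCTOAI_TOKEN = os.environ["OCTOAI_TOKEN"]


def key_is_valid(api_key: str) -> bool:
    """Check the per-user key against PostgreSQL (schema assumed)."""
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT 1 FROM api_keys WHERE key = %s AND active", (api_key,)
            )
            return cur.fetchone() is not None
    finally:
        conn.close()


@app.route("/api/v1/chat", methods=["POST"])
def chat():
    # Per-user key issued by the Playground, sent as a bearer token.
    api_key = request.headers.get("Authorization", "").removeprefix("Bearer ").strip()
    if not key_is_valid(api_key):
        return jsonify({"error": "invalid API key"}), 401

    # Forward the user's request to the hosted open-source model.
    upstream = requests.post(
        OCTOAI_ENDPOINT,
        headers={"Authorization": f"Bearer {OCTOAI_TOKEN}"},
        json=request.get_json(),
        timeout=60,
    )
    return jsonify(upstream.json()), upstream.status_code


if __name__ == "__main__":
    app.run(debug=True)
```

Keeping this proxy layer in one place lets Davis AI manage enterprise licenses and provider tokens centrally, while faculty only ever handle their own Playground key.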