PROJECTS
Whether you are new to coding or are looking to improve your skills, we have opportunities for you!
Our Teams
Client Project
Our client teams allow those with more experience to work on industry-facing projects and build on their software development foundations.
What you'll do:
- Master new tech stacks by working on projects in areas such as data science, full-stack development, and back-end infrastructure.
- Work side by side with engineers at top companies, learning industry workflows and practices.
- Receive regular feedback and iterate on your work.
Mentored Project
Our mentored team provides the opportunity for those with no prior software engineering experience to learn the skills needed to take on industry-facing projects.
What you'll do:
- Learn good coding practices and build your own personal website from scratch.
- Design, develop, and deliver a full-stack web application for a nonprofit client.
- Learn modern frameworks and technologies like React, Node, and GraphQL.
FALL '24 PROJECTS
Fall 2024 projects will be announced soon. In the meantime, take a look at last semester's projects!
Client Projects
Our client teams work with industry partners to build products ranging from full stack web development to machine learning.
Google Labs is Google's home for the latest AI experiments and technology. Codebase developed Sparky, a mobile app for Google's AI Studio, which allows developers to prototype generative AI models. Sparky makes it easier for developers to test multimodal prompts and interact with Google's Gemini API.
Meta builds technologies that help people connect, find communities and grow businesses. For our second Meta project, Codebase worked on the Facebook General Matrix Multiplication (FBGEMM) PyTorch library to extend auto-vectorization of matrix multiplication operations for ARM CPUs, enabling Meta's recommendation models to run efficiently on ARM architecture.
Meta's video team wanted to optimize algorithms for scene-change detection, an important step in video analysis. Codebase developed statistics-based algorithms and machine learning models that doubled the performance of Meta's scene-change detection.
Sourcegraph allows developers to rapidly search, write, and understand code by bringing insights from their entire codebase right into the editor. Our team built a machine learning training and evaluation pipeline for Cody, Sourcegraph's AI-powered code autocomplete assistant, ranking and integrating varied context sources like issue tickets and documentation and evaluating their effect on LLM performance.
Mentored Project
Our mentored team focuses on learning the essentials of software development while building a full-stack web application for a nonprofit organization.
Development Timeline
Here’s a breakdown of how our projects are run every semester.
Before the Semester
Over break, our leadership team works together to source and scope new projects with industry clients and nonprofits.
Project Kickoff
After welcoming our new members and a club-wide retreat, we form project teams based on developer preferences! Everybody is onboarded onto their projects’ tech stacks.
Mid-Semester Deliverable
Around the halfway point, our teams visit our clients and present a deliverable of what they’ve accomplished so far. We demo what we’ve built, take constructive feedback, and go over next steps.
Final Deliverable
At the end of the semester, we present a polished version of our product, go over project features, and give our clients a chance to ask questions. Afterward, we celebrate a semester's hard work at our banquet!
Technical Handoff
In the final week, our developers and PMs coordinate with our clients to clarify documentation and assist in the integration of our code into their codebase.
Past Projects
Every semester we take on five new projects with high-growth tech companies. Here are some of our past projects!
DoorDash is an online food ordering and food delivery platform. Our team built a production-like testing platform from 0 to 1 to rigorously evaluate and test ML engineers' PyTorch models. This deployed environment enabled faster model development and validation and provided predictive insights to prevent issues from occurring in production systems.