Working on LLMs.
Previously, I was a software engineer working on open-source Kubernetes at Google, building and maintaining Kubernetes developer tools such as minikube and skaffold. I also worked on machine learning pipelines as a maintainer of the Kubeflow project. Before Google, I worked at The Blackstone Group in NYC.
I received a BA in Mathematics from Columbia University. I have an MBA from Stanford Graduate School of Business, where I was an Arjay Miller Scholar.
- Twitter (@mattrickard)
- LinkedIn (msrickard)
- GitHub (r2d4)
- Threads (matt.rickard)
- Email ([email protected])
Software
AI
- ReLLM - Constrain an LLM's next-token logits with a regular expression before sampling (a toy sketch of the idea follows this list).
- ParserLLM - Context-free grammar constraints for any LLM.
- Kubeflow - Machine Learning Toolkit for Kubernetes.
- @react-llm - Browser-based LLM inference. See chat.matt-rickard.com.
- LLaMaTab - Chrome extension that runs LLM inference in the browser.
- openlm - OpenAI-compatible Python library that can call any LLM.
- llm.ts - OpenAI-compatible TypeScript library (browser, Node, Deno).
- ScapeNet and osrs-ocr - Vision and text models for an MMORPG.
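A minimal sketch of the logit-masking idea behind ReLLM (and, with a grammar in place of the regex, ParserLLM). This is illustrative only, not either library's API: the toy vocabulary, scores, and pattern are made up, and a real implementation checks partial regex matches so longer outputs can be built up token by token.

```python
import re

# Toy illustration of regex-constrained decoding: before sampling each token,
# mask out any token that would violate the pattern.
# The vocabulary, scores, and pattern below are made up for the example.
vocab = ["cat", "dog", "123", "42", " ", "!"]
pattern = re.compile(r"[0-9]+")  # only digit strings allowed

def mask_logits(prefix: str, logits: list[float]) -> list[float]:
    """Set the score of any disallowed token to -inf so it can't be sampled."""
    masked = []
    for token, score in zip(vocab, logits):
        # Simplification: require a full match. A real implementation asks
        # whether prefix + token could still grow into a match (partial matching).
        ok = pattern.fullmatch(prefix + token) is not None
        masked.append(score if ok else float("-inf"))
    return masked

logits = [2.0, 1.5, 1.0, 0.8, 0.2, 0.1]  # pretend model scores for vocab
masked = mask_logits("", logits)
best = vocab[max(range(len(vocab)), key=lambda i: masked[i])]
print(best)  # "123" -- the highest-scoring token that satisfies the regex
```

Greedy selection keeps the sketch short; sampling from the masked distribution works the same way.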
Distributed Systems
- minikube: run Kubernetes locally
- skaffold: Kubernetes developer tool
- dacc: cache-efficient, sandboxed builds as code
- virgo: graph-based configuration language
- distroless: language-runtime Docker images without an operating system
- mockerfile: alternative Dockerfile frontend
- docker-merge: merge Docker images
- minikube-kvm-driver: manage virtual machine lifecycles with KVM