Stars
High-efficiency floating-point neural network inference operators for mobile, server, and Web
Cross-platform, customizable ML solutions for live and streaming media.
Summaries and resources for Designing Machine Learning Systems book (Chip Huyen, O'Reilly 2022)
Explains complex systems using visuals and simple terms, and helps you prepare for system design interviews.
This repo is meant to serve as a guide for Machine Learning/AI technical interviews.
Build ChatGPT over your data, all with natural language
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
A library for efficient similarity search and clustering of dense vectors.
Google Research
Flax is a neural network library for JAX that is designed for flexibility.
Training and evaluation pipeline for MEG and EEG brain signal encoding and decoding using deep learning. Code for our paper "Decoding speech perception from non-invasive brain recordings" published…
Official Jax Implementation of MaskGIT
A curated list of awesome open source libraries to deploy, monitor, version and scale your machine learning
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers." (ICLR 2023) https://arxiv.org/abs/2211.14730
LlamaIndex is the leading framework for building LLM-powered agents over your data.
A PyTorch implementation of Perceiver, Perceiver IO and Perceiver AR with PyTorch Lightning scripts for distributed training
Sample code and notebooks for Generative AI on Google Cloud, with Gemini on Vertex AI
Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"
🦜🔗 Build context-aware reasoning applications
This repo includes ChatGPT prompt curation to use ChatGPT and other LLM tools better.
ChatDBG - AI-assisted debugging. Uses AI to answer 'why'
Scalene: a high-performance, high-precision CPU, GPU, and memory profiler for Python with AI-powered optimization proposals
A playbook for systematically maximizing the performance of deep learning models.