From images to inference with no labeling: use foundation models to train supervised models.
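A minimal sketch of that workflow, assuming the autodistill package with its GroundedSAM and YOLOv8 plugins; the folder paths, prompt, and class name are illustrative, not taken from any one repo:

```python
from autodistill.detection import CaptionOntology
from autodistill_grounded_sam import GroundedSAM
from autodistill_yolov8 import YOLOv8

# Map a foundation-model prompt to the class name the student model will learn.
ontology = CaptionOntology({"shipping container": "container"})

# 1. A foundation model auto-labels a folder of raw images.
base_model = GroundedSAM(ontology=ontology)
base_model.label(input_folder="./images", extension=".jpeg", output_folder="./dataset")

# 2. A small supervised model trains on the auto-labeled dataset.
target_model = YOLOv8("yolov8n.pt")
target_model.train("./dataset/data.yaml", epochs=200)
```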
Awesome Knowledge Distillation
Mechanistically interpretable neurosymbolic AI (Nature Comput Sci 2024): losslessly compressing NNs into computer code and discovering new algorithms that generalize out of distribution and outperform human-designed algorithms.
Use LLaMA to label data for fine-tuning an LLM.
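A hedged sketch of that labeling loop using the Hugging Face transformers pipeline; the model name, prompt template, and label set are assumptions for illustration:

```python
import json
from transformers import pipeline

# Assumed model checkpoint; any instruction-tuned LLaMA variant would do.
generator = pipeline("text-generation", model="meta-llama/Llama-2-7b-chat-hf")
LABELS = ["positive", "negative", "neutral"]  # illustrative label set

def label_example(text: str) -> str:
    prompt = (f"Classify the sentiment of this review as one of {LABELS}.\n"
              f"Review: {text}\nLabel:")
    out = generator(prompt, max_new_tokens=5, do_sample=False)[0]["generated_text"]
    answer = out[len(prompt):].strip().lower()
    # Fall back to "neutral" if the model answers outside the label set.
    return next((label for label in LABELS if label in answer), "neutral")

# Write a JSONL file that a fine-tuning job can consume.
with open("labeled.jsonl", "w") as f:
    for text in ["Great battery life.", "Screen died after a week."]:
        f.write(json.dumps({"text": text, "label": label_example(text)}) + "\n")
```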
Use AWS Rekognition to train custom models that you own.
Autodistill Google Cloud Vision module for training a custom, fine-tuned model.
Repository for the publication "AutoGraph: Predicting Lane Graphs from Traffic"
Our open-source implementation of MiniLMv2 (https://aclanthology.org/2021.findings-acl.188).
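The core of MiniLMv2 is matching self-attention relation distributions (query-query, key-key, value-value) between one teacher layer and one student layer. A minimal sketch of one such relation loss; the tensor shapes and layer pairing are assumptions:

```python
import math
import torch
import torch.nn.functional as F

def relation_kl(teacher_vecs: torch.Tensor, student_vecs: torch.Tensor) -> torch.Tensor:
    """KL divergence between teacher and student self-relation distributions.

    Both inputs: [batch, relation_heads, seq_len, head_dim]. The same loss is
    applied to queries, keys, and values, and the three terms are summed.
    """
    rel_t = F.softmax(
        teacher_vecs @ teacher_vecs.transpose(-1, -2)
        / math.sqrt(teacher_vecs.size(-1)), dim=-1)
    rel_s = F.log_softmax(
        student_vecs @ student_vecs.transpose(-1, -2)
        / math.sqrt(student_vecs.size(-1)), dim=-1)
    return F.kl_div(rel_s, rel_t, reduction="batchmean")
```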
A framework for knowledge distillation using TensorRT inference on the teacher network.
The Codebase for Causal Distillation for Task-Specific Models
🚀 PyTorch Implementation of "Progressive Distillation for Fast Sampling of Diffusion Models" (v-diffusion)
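The idea, roughly: a student is trained to reach in one sampler step the point its teacher reaches in two, halving the step count per distillation round. A schematic sketch; `ddim_step`, the time grid, and the plain MSE objective (the paper uses an SNR-weighted loss on the predicted clean image) are simplifications:

```python
import torch

def progressive_distill_loss(teacher, student, ddim_step, x_t, t, dt):
    # Teacher: two consecutive DDIM steps of size dt/2, no gradients.
    with torch.no_grad():
        x_mid = ddim_step(teacher, x_t, t, t - dt / 2)
        x_target = ddim_step(teacher, x_mid, t - dt / 2, t - dt)
    # Student: one step of size dt must land on the teacher's two-step result.
    x_pred = ddim_step(student, x_t, t, t - dt)
    return torch.mean((x_pred - x_target) ** 2)
```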
The Codebase for Causal Distillation for Language Models (NAACL '22)
[Master Thesis] Research project at the Data Analytics Lab in collaboration with Daedalean AI. The thesis was submitted to both ETH Zürich and Imperial College London.
Matching Guided Distillation (ECCV 2020)
Model distillation of CNNs for classification of seafood images in PyTorch
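The usual objective for this kind of classifier distillation is the soft-target loss of Hinton et al. (2015); a self-contained PyTorch version, with the temperature and mixing weight as illustrative defaults rather than values from the repo:

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft targets: student matches the teacher's temperature-softened distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T^2 keeps soft- and hard-loss gradients on the same scale
    # Hard targets: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```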
Awesome Deep Model Compression