Digest for 10.12.2023 — 17.12.2023
And now for what has been happening recently on other resources.
Events
Discussions
PSF Blog
StackOverflow in Russian
Popular on GitHub
| Repository | Description |
| --- | --- |
| ml-explore/mlx-examples | Examples in the MLX framework |
| SuperDuperDB/superduperdb | 🔮 SuperDuperDB. Bring AI to your database; integrate, train and manage any AI models and APIs directly with your database and your data. |
| spla-tam/SplaTAM | SplaTAM: Splat, Track & Map 3D Gaussians for Dense RGB-D SLAM |
| jackfrued/Python-100-Days | Python - 100 days from novice to master |
| huggingface/optimum-nvidia | |
| state-spaces/mamba | |
| fishaudio/Bert-VITS2 | vits2 backbone with multilingual-bert |
| Anjok07/ultimatevocalremovergui | GUI for a Vocal Remover that uses Deep Neural Networks. |
| llmware-ai/llmware | Providing enterprise-grade LLM-based development framework, tools, and fine-tuned models. |
| facebookresearch/Pearl | A production-ready Reinforcement Learning AI Agent Library from the Applied Reinforcement Learning team at Meta. |
| stanford-futuredata/megablocks | |
| wilsonfreitas/awesome-quant | A curated list of insanely awesome libraries, packages and resources for Quants (Quantitative Finance) |
| bentoml/OpenLLM | Operating LLMs in production |
| swisskyrepo/PayloadsAllTheThings | A list of useful payloads and bypasses for Web Application Security and Pentest/CTF |
| vllm-project/vllm | A high-throughput and memory-efficient inference and serving engine for LLMs |
| ise-uiuc/magicoder | Magicoder: Source Code Is All You Need |
| Lightning-AI/lit-gpt | Hackable implementation of state-of-the-art open-source LLMs based on nanoGPT. Supports flash attention, 4-bit and 8-bit quantization, LoRA and LLaMA-Adapter fine-tuning, and pre-training. Apache 2.0-licensed. |
| kyegomez/Gemini | An open-source implementation of Gemini, the Google model said to "eclipse ChatGPT" |
| Asabeneh/30-Days-Of-Python | The 30 Days of Python programming challenge is a step-by-step guide to learning the Python programming language in 30 days. The challenge may take more than 100 days; follow your own pace. These videos may help too: https://www.youtube.com/<...>UC7PNRuno1rzYPb1xLa4yktw |
| karpathy/nanoGPT | The simplest, fastest repository for training/finetuning medium-sized GPTs. |
| lukas-blecher/LaTeX-OCR | pix2tex: Using a ViT to convert images of equations into LaTeX code. |
Worth noting
Registered users can receive a weekly digest of updates on the site.