Feed

This is a feed of links I've run across and found interesting or noteworthy. The images, content and opinions in them are owned by their respective authors.
A lightweight .NET client for LocalStack

localstack-dotnet - github.com

This GitHub project provides a .NET client for LocalStack, making it easier to develop and test AWS integrations locally instead of against real AWS services.

LocalStack, .NET, client, GitHub
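
The idea behind a client like this is to point ordinary AWS SDK calls at LocalStack's local endpoint instead of real AWS. As a rough sketch of that pattern (not necessarily this library's own API), here is what it looks like with the plain AWS SDK for .NET, assuming LocalStack is listening on its default edge port 4566:

```csharp
using Amazon.S3;
using Amazon.S3.Model;

// Redirect a standard AWS SDK client to LocalStack instead of AWS.
// The endpoint and the dummy credentials are assumptions; adjust to your setup.
var config = new AmazonS3Config
{
    ServiceURL = "http://localhost:4566", // LocalStack edge port
    ForcePathStyle = true                 // path-style addressing for a local S3 endpoint
};

var s3 = new AmazonS3Client("test", "test", config);

await s3.PutBucketAsync(new PutBucketRequest { BucketName = "demo-bucket" });
Console.WriteLine("Bucket created against LocalStack.");
```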

Daily .NET Newsletter

.NET News - dotnetnews.co

Subscribe to receive carefully curated C# and .NET articles, drawn from over 140 sources and delivered daily.

C#, .NET, Azure, newsletter, curated content

C# Is Cool Again. You Can’t Avoid It Anymore.

Piyush Doorwar - medium.com

C# and .NET are experiencing a remarkable revival, shedding their reputation as outdated tools and re-emerging as modern essentials for software engineers.

C#, .NET, software engineering, programming, technology

What is a Reverse Proxy? YARP Explained

Milan Jovanović - youtube.com

Learn the role of reverse proxies in modern .NET apps, covering security, load balancing, and setup using YARP.

Reverse Proxy, YARP, .NET, Load Balancing, Security
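
For a sense of how little code a basic YARP setup needs, here is a minimal sketch. It assumes the Yarp.ReverseProxy package is installed and that routes and clusters are defined in an appsettings.json section named "ReverseProxy", which is the convention YARP's documentation uses:

```csharp
var builder = WebApplication.CreateBuilder(args);

// Register YARP and load route/cluster definitions from configuration.
// The "ReverseProxy" section is expected to contain "Routes" (path matching)
// and "Clusters" (backend destination addresses).
builder.Services.AddReverseProxy()
    .LoadFromConfig(builder.Configuration.GetSection("ReverseProxy"));

var app = builder.Build();

// Forward every matched request to its configured cluster.
app.MapReverseProxy();

app.Run();
```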

The Secret Weapon in .NET 9 for Building AI-Powered C# Apps That Actually Scale

Sukhpinder Singh - medium.com

Discover how .NET 9 and Microsoft.Extensions.AI enable scalable AI-powered C# apps, including a GPT-5 pipeline built on embeddings and vectors.

C#, .NET 9, Microsoft.Extensions.AI, AI, GPT-5, embeddings, vector
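
The embeddings-and-vectors part of a pipeline like that comes down to comparing vectors: an embedding generator (Microsoft.Extensions.AI exposes this as IEmbeddingGenerator) turns text into float vectors, and similarity between those vectors is what ranks candidates. The generator call itself is left out below, since the library's preview APIs are still shifting; this sketch only shows the vector math, with made-up embedding values standing in for real model output:

```csharp
// Cosine similarity between two embedding vectors, the measure typically
// used to rank candidate documents against a query in a retrieval step.
static double CosineSimilarity(float[] a, float[] b)
{
    double dot = 0, magA = 0, magB = 0;
    for (int i = 0; i < a.Length; i++)
    {
        dot  += a[i] * b[i];
        magA += a[i] * a[i];
        magB += b[i] * b[i];
    }
    return dot / (Math.Sqrt(magA) * Math.Sqrt(magB));
}

float[] query = { 0.12f, 0.84f, 0.33f }; // embedding of the user's question (illustrative values)
float[] doc   = { 0.10f, 0.80f, 0.40f }; // embedding of a candidate document

Console.WriteLine($"similarity: {CosineSimilarity(query, doc):F3}");
```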

Ollama vs LM Studio: Which Local AI Tool Wins in 2025?

Savage Reviews - youtube.com

Compare Ollama's CLI flexibility and speed with LM Studio's GUI-driven ease of use for running AI models locally, and choose based on your needs: automation or content creation.

Ollama, LM Studio, AI tools, CLI, GUI, local AI
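
Whichever tool wins for you, the practical payoff is the same: a model served on a local HTTP endpoint that your own scripts and apps can call, which is where Ollama's automation strength shows. As an illustrative sketch, Ollama listens on port 11434 by default; the model name "llama3" below is an assumption, so substitute whatever you have pulled:

```csharp
using System.Net.Http;
using System.Text;

// Ask a locally running Ollama instance for a completion via its REST API.
// Assumes the model has already been pulled (e.g. with `ollama pull llama3`).
using var http = new HttpClient();

var body = """
{ "model": "llama3", "prompt": "Explain reverse proxies in one sentence.", "stream": false }
""";

var response = await http.PostAsync(
    "http://localhost:11434/api/generate",
    new StringContent(body, Encoding.UTF8, "application/json"));

Console.WriteLine(await response.Content.ReadAsStringAsync());
```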

The Easiest Ways to Run LLMs Locally - Docker Model Runner Tutorial

Tech With Tim - youtube.com

Docker's new Model Runner simplifies local AI model management in much the same way as Ollama, and it integrates with Docker Desktop for easier deployment.

Docker, AI, LLMs, model runner

EASIEST Way to Fine-Tune a LLM and Use It With Ollama

Tech With Tim - youtube.com

Learn to fine-tune LLMs in Python for Ollama with step-by-step guidance, code examples, and setup instructions.

Python, Ollama, LLM

Optimize Your AI - Quantization Explained

Matt Williams - youtube.com

Learn how quantization techniques (q2, q4, q8) shrink AI models so they perform well on modest hardware while cutting memory use and cost.

AI Optimization, Quantization, Ollama, Machine Learning
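
The arithmetic behind those savings is simple: memory scales with bits per weight. A back-of-the-envelope sketch for a 7B-parameter model (real quantized formats such as GGUF add a little per-block overhead, so treat these as approximations):

```csharp
// Approximate memory footprint of a 7B-parameter model at different
// quantization levels; actual files are slightly larger because of
// per-block scale metadata.
const double parameters = 7e9;

foreach (var (name, bitsPerWeight) in new[] { ("fp16", 16.0), ("q8", 8.0), ("q4", 4.0), ("q2", 2.0) })
{
    double gigabytes = parameters * bitsPerWeight / 8 / 1e9;
    Console.WriteLine($"{name,-4}: ~{gigabytes:F1} GB");
}
```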

Can We Fix Software Engineering Estimation?

Kent Beck & Kevlin Henney - youtube.com

An exploration of whether software estimation is feasible at all and which estimation techniques fit modern engineering practice.

software estimation, no estimates, techniques, software engineering
