How to Understand Batch Normalization?

When I first tried to understand Batch Normalization (BN), I went through a lot of resources, but I still found its specific implementation and purpose somewhat unclear. A few days ago, I asked my teacher for an explanation, and he gave me an example that made it much easier to grasp. I found it very helpful, and after going back to review the other materials, everything started to make sense....

August 11, 2024 · 5 min

My Notes on Learning LeNet-5

“The more I get, the less I know.” (bdim) Recently, I have been trying to learn more about image recognition. One of the topics I am interested in is LeNet, a classic convolutional neural network. In this article, I will re-implement LeNet using PyTorch and then try to understand, at a conceptual level, the parts that are still unclear to me. The contents of this article may be inaccurate, and I will continue to update the details....
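
As a rough companion to that post, here is a minimal LeNet-5-style network in PyTorch. It is only a sketch: the layer sizes follow the classic 32×32 grayscale-input formulation rather than anything taken from the post itself.

```python
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    """A minimal LeNet-5-style CNN for 32x32 grayscale inputs (e.g. padded MNIST)."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),   # 32x32 -> 28x28
            nn.Tanh(),
            nn.AvgPool2d(2),                  # 28x28 -> 14x14
            nn.Conv2d(6, 16, kernel_size=5),  # 14x14 -> 10x10
            nn.Tanh(),
            nn.AvgPool2d(2),                  # 10x10 -> 5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),
            nn.Tanh(),
            nn.Linear(120, 84),
            nn.Tanh(),
            nn.Linear(84, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Quick shape check on a dummy batch.
if __name__ == "__main__":
    model = LeNet5()
    print(model(torch.randn(4, 1, 32, 32)).shape)  # torch.Size([4, 10])
```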

July 13, 2024 · 12 min

Documenting the Recovery of a Lost Video File Due to FX3 Power Failure

Today, while I was recording a video for a client, the dummy battery ran out of power, causing the camera to shut down abruptly and leaving the video file unsaved. I didn’t realize this immediately, because my impression of the FX3 was that it would save the video file even if the power was forcibly cut off during recording. It wasn’t until I got home and checked the memory card that I was shocked: the first video file was missing!...

June 21, 2024 · 3 min

Getting Started with WireGuard: A Configuration Memo

Recently, I wanted to deploy my music download bot to the cloud, so I set up a server on Hetzner. The quality and speed are impressive—truly German engineering! I decided to configure a WireGuard connection between my local machine and Hetzner. Despite having used WireGuard for a while, this was my first time setting it up properly; previously, I had only copied configurations from others while playing with DN42, without delving into the details....
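
For context, a point-to-point setup like the one described usually comes down to a short config file on each end plus `wg-quick up wg0`. The sketch below uses placeholder keys and addresses, not the actual values from the post.

```ini
# /etc/wireguard/wg0.conf on the Hetzner server (placeholder values only)
[Interface]
# VPN-internal address of the server
Address = 10.0.0.1/24
ListenPort = 51820
PrivateKey = <server-private-key>

[Peer]
# the local machine; only its VPN address is routed through the tunnel
PublicKey = <local-machine-public-key>
AllowedIPs = 10.0.0.2/32
```

The local machine's config mirrors this, with its own `[Interface]` keys and a `[Peer]` section whose `Endpoint` points at the server's public IP and port.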

June 10, 2024 · 4 min

Using Copilot with Locally & Privately Deployed Models via Ollama

Recently, Mistral AI released a highly efficient code-writing model called Codestral. It’s only 22B parameters, and my MacBook can easily handle it, using around 15GB of memory while running. I wanted to integrate it with VSCode to replace GitHub Copilot for more secure coding. Make sure you have Ollama and the codestral model installed; it’s straightforward if you follow the official links. To get a Copilot-like experience in VSCode, you need to install codestral and starcoder2 via Ollama, then install the Continue plugin....
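
For reference, the Ollama side of that setup is roughly the following; the exact model tags (such as `starcoder2:3b` for autocompletion) are my assumption, not taken from the post.

```bash
# Pull a code model for chat/edits and a smaller one for inline completion
ollama pull codestral
ollama pull starcoder2:3b

# Start the Ollama server if it is not already running in the background
ollama serve
```

From there, the Continue plugin is configured to use Ollama as its provider; the exact settings are described in Continue's documentation.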

June 5, 2024 · 3 min