Plight of Chinese Developers

let-the-bullets-fly

Recently, out of personal need, I revisited domestic VPS and "Serverless" (cloud function) offerings. Services that ran perfectly during local debugging hit problems after deployment: either dependencies failed to install, forcing me to switch to domestic mirrors one by one, or installation was painfully slow, wasting a considerable amount of time waiting on the network.

It drove home my belief that the three biggest obstacles holding back developers in mainland China are the network, the network, and, damn it, the network!

Of course, this article is not about teaching you how to solve network problems, but about how I cope with situations like these.

Approach

The entry point is my blog, which I had previously migrated to Cloudflare Pages specifically to get global acceleration. My analysis showed that the initial load speed of a mainland CDN pulling from a Cloudflare Pages origin is not ideal, and keeping a mainland COS (Cloud Object Storage) replica doesn't bring much benefit either.

Moreover, the GitHub ecosystem is just too convenient, especially combined with GitHub Actions. I did try Tencent Cloud's Serverless, but the experience was poor, mainly because network issues prevented seamless integration with the GitHub ecosystem. Every change had to be compiled and packaged locally with Go, uploaded manually through the official console, and then the various trigger conditions edited by hand. What an experience! 😰

Objective

Simple and effective.

Judging from my current traffic statistics, global acceleration is no longer necessary. So I ultimately went back to GitHub + Cloudflare Pages, with code management and service building centered on GitHub.

Worried about access speed? If you can use GitHub normally, the speed won’t be slow 🤓.

Operation

For this site, only two things need to be taken care of: the site content itself and the image resources.

Both were hosted on Qiniu with HTTPS support. Since they are replicas with their own caching layer, each update could leave them briefly inconsistent, which I currently work around by calling Qiniu's cache-refresh API from GitHub Actions 😂.
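For reference, that cache-refresh step can be sketched as a GitHub Actions fragment. This is a hedged sketch, not my exact workflow: it assumes Qiniu's qshell CLI (its cdnrefresh command), a URL list file named refresh-urls.txt, and repository secrets named QINIU_AK/QINIU_SK — all of these names are assumptions.

```yaml
# .github/workflows/deploy.yml (fragment) — step, file, and secret names are hypothetical
- name: Refresh Qiniu CDN cache after publishing
  run: |
    # qshell is Qiniu's CLI; "account" stores credentials, "cdnrefresh" purges cached URLs
    qshell account ${{ secrets.QINIU_AK }} ${{ secrets.QINIU_SK }} blog
    qshell cdnrefresh -i refresh-urls.txt
```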

For the site content itself, thanks to the earlier DNS-based global-acceleration setup, all I had to do was remove the acceleration policy.

For the image resources, after comparing various options I realized my needs are minimal: there are fewer than 90 images so far, so they can live alongside the blog. I encode them as lossless WebP and store them in the repository.

webp-usage

Image Processing

  1. Image acquisition: Qiniu COS's console supports neither batch downloads nor downloading an entire directory, but Qiniu does provide a command-line tool for batch downloading.

  2. Encoding to WebP: to minimize adaptation work, the directory structure stays unchanged. I use cwebp to encode the files into WebP (with -lossless for lossless encoding), and wrote a simple script to batch-process them.

<?php

// Batch-convert all jpg/png files under $path to lossless WebP,
// deleting each original on success.
$path = getenv('HOME') . '/Downloads/images'; // PHP does not expand "~"
$dirInfo = dirToArray($path);

$result = genBlockContent($dirInfo, $path);

var_dump($result);

// Walk the directory tree and convert each image, returning a map of
// source path => status.
function genBlockContent($dirInfo, $path)
{
    $result = [];
    foreach ($dirInfo as $key => $info) {
        if (is_array($info)) {
            // Subdirectory: recurse and merge its results.
            $result = array_merge($result, genBlockContent($info, $path . DIRECTORY_SEPARATOR . $key));
            continue;
        }
        $p = $path . DIRECTORY_SEPARATOR . $info;
        $out = $path . DIRECTORY_SEPARATOR . pathinfo($p, PATHINFO_FILENAME) . '.webp';
        $code = -1;
        $output = [];
        // Quote paths so spaces and shell metacharacters are safe.
        exec(sprintf('/usr/local/bin/cwebp -lossless %s -o %s', escapeshellarg($p), escapeshellarg($out)), $output, $code);
        $result[$p] = $code;
        if ($code === 0) {
            // Conversion succeeded: remove the original file.
            $result[$p] = (int)unlink($p);
        }
    }
    return $result;
}

// Build a nested array mirroring the directory tree, keeping only jpg/png files.
function dirToArray($dir)
{
    $filter = ['.', '..', '.git', '.DS_Store'];
    $keepExt = ['jpg', 'png'];

    $result = [];
    foreach (scandir($dir) as $value) {

        if (in_array($value, $filter, true)) {
            continue;
        }

        if (is_dir($dir . DIRECTORY_SEPARATOR . $value)) {
            $result[$value] = dirToArray($dir . DIRECTORY_SEPARATOR . $value);
            continue;
        }

        if (in_array(pathinfo($value, PATHINFO_EXTENSION), $keepExt, true)) {
            $result[] = $value;
        }
    }

    return $result;
}

  3. Resource building and deployment: I use GitHub Actions together with Cloudflare's Wrangler tool for a custom deployment.

  4. DNS switch: finally, I point the DNS from Qiniu to Cloudflare.
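Step 3 above can be sketched as a GitHub Actions workflow. This is a minimal sketch under stated assumptions, not my actual configuration: the build command, the ./public output directory, the Pages project name "blog", and the secret names are all hypothetical.

```yaml
# .github/workflows/deploy.yml — build command, project, and secret names are hypothetical
name: deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build the site
        run: make build   # placeholder: whatever produces ./public
      - name: Deploy to Cloudflare Pages via Wrangler
        run: npx wrangler pages deploy ./public --project-name=blog
        env:
          CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }}
          CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}
```

Driving the deploy from Actions keeps the repository as the single source of truth; a push to main rebuilds and publishes without touching any console.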

Conclusion

With that, the migration is complete. I only need to maintain a single repository on GitHub; everything else is handled automatically by Cloudflare. I also found Cloudflare's handling of the HTTP protocol to be very standards-compliant and well done. I learned a lot of protocol details from it, and it's more reliable and stress-free than handling them myself 😴.

To make image handling easier, I also took the opportunity to adapt a PicGo image plugin into picgo-plugin-compress-webp-lossless. The main change was setting the WebP encoding parameters to lossless, since storage, not compression, was the goal. It is published on npm and can be installed directly from PicGo's plugin settings.

Final Thoughts

Every time I hit an issue caused by network problems, or spend significant effort solving the "non-technical problems" they create, it pains me. Time gets wasted on meaningless work, and sometimes it hardly feels worth it, maybe even less productive than fiddling with my phone.

Going from once-smooth to today's obstacles is not a network problem; it's my problem, and perhaps yours too.

Now that the move is done, it brings nothing new to the domestic internet; on the contrary, it may reduce its diversity a little. But who cares?

But I know I’m not the first, and certainly not the last.