4 files changed, 5 insertions, 5 deletions
diff --git a/content/posts/abusing-systemd-nspawn-with-nested-containers.md b/content/posts/abusing-systemd-nspawn-with-nested-containers.md
index c82b35b..cdf6b8d 100644
--- a/content/posts/abusing-systemd-nspawn-with-nested-containers.md
+++ b/content/posts/abusing-systemd-nspawn-with-nested-containers.md
@@ -7,7 +7,7 @@ tags: ["linux", "containers", "systemd"]
 ---
 
-systemd-nspawn is some pretty insane stuff that you probably already have installed on your computer. Today's mission to wreck havoc will be to replicate an old project I did a few years ago, [Arch All the Way Down](https://git.exozy.me/a/Arch-All-the-Way-Down), but using some real power tools. I was able to achieve 4 nested containers with Docker and my old laptop. It shouldn't be too hard to improve that!
+systemd-nspawn is some pretty insane stuff that you probably already have installed on your computer. Today's mission to wreck havoc will be to replicate an old project I did a few years ago, [Arch All the Way Down](https://git.unnamed.website/arch-all-the-way-down), but using some real power tools. I was able to achieve 4 nested containers with Docker and my old laptop. It shouldn't be too hard to improve that!
 
 There's a really [cool trick you can do with systemd-nspawn](https://0pointer.net/blog/running-an-container-off-the-host-usr.html) where you can almost instantly create a container based on your existing root directory:
 
 ```bash
diff --git a/content/posts/bad-apple-animated-qr-code.md b/content/posts/bad-apple-animated-qr-code.md
index 1bf1dde..48576d8 100644
--- a/content/posts/bad-apple-animated-qr-code.md
+++ b/content/posts/bad-apple-animated-qr-code.md
@@ -19,6 +19,6 @@ Here's a demo:
 
 {{< video "/vid/bad-apple-animated-qr-code.mp4" >}}
 
-And here's our [actual final report](/src/6_8301_Project.pdf) (read the appendix) and [code for that video](https://git.exozy.me/k/6.8301-Project/src/branch/alg) and (unrelated) [Bad Apple sung in ancient Egyptian](https://www.bilibili.com/video/BV1YD4y1f71Y/).
+And here's our [actual final report](/src/6_8301_Project.pdf) (read the appendix) and [code for that video](https://git.unnamed.website/6.8301-project/) and (unrelated) [Bad Apple sung in ancient Egyptian](https://www.bilibili.com/video/BV1YD4y1f71Y/).
 
 It took Kevin and me forever to come up with a name, and I was a huge proponent of "Epilepsend" but we ended up with the much less aggressive "SWANTV". Still, it'll always be Epilepsend in my heart since our project is not graceful like a swan. It's more like screaming at 30 FPS. And while working on this project, I think I got too engrossed and accidentally started saying things like "I can walk at 30 FPS" when I meant to say I can speed-walk quickly. Also, the Epilepsend codes were slighlty headache-inducing at the beginning and I saw them flashing around when I closed my eyes at night, but now I think I've become desensitized to them.
diff --git a/content/posts/extreme-epilepsend.md b/content/posts/extreme-epilepsend.md
index d693e5c..909dbbc 100644
--- a/content/posts/extreme-epilepsend.md
+++ b/content/posts/extreme-epilepsend.md
@@ -7,7 +7,7 @@ tags: ["computer-vision", "multiprocessing", "epilepsend", "video"]
 ---
 
-Kevin and I created [Epilepsend](/posts/bad-apple-animated-qr-code/) for an MIT final project a few months ago, and the thing about final projects is the deadline is always a day too early, no matter how early you start. Our project couldn't even break the 0.3 mbps barrier until two days before the deadline! Fortunately, I discovered that an earlier commit had cropped the camera feed to avoid distortion, at the cost of significantly reducing the camera acuity. [Removing that code](https://git.exozy.me/k/6.8301-Project/commit/a1d77a466dcc694f0101d2588529ada8c858ed94) boosted our bandwidth by an order of magnitude. So I've been wondering: Are there any other bandwidth-boosting tweaks out there that we just didn't have time to try?
+Kevin and I created [Epilepsend](/posts/bad-apple-animated-qr-code/) for an MIT final project a few months ago, and the thing about final projects is the deadline is always a day too early, no matter how early you start. Our project couldn't even break the 0.3 mbps barrier until two days before the deadline! Fortunately, I discovered that an earlier commit had cropped the camera feed to avoid distortion, at the cost of significantly reducing the camera acuity. [Removing that code](https://git.unnamed.website/6.8301-project/commit/?id=a1d77a466dcc694f0101d2588529ada8c858ed94) boosted our bandwidth by an order of magnitude. So I've been wondering: Are there any other bandwidth-boosting tweaks out there that we just didn't have time to try?
 
 I initially wanted Epilepsend to be lightweight enough to run on my laptop, but it has a bad habit of scorching my CPU. Sure, it can hit 2 mbps for a few seconds, and then the temperature skyrockets and Epilepsend starts lagging behind the camera stream. For the *Bad Apple!!* demo, I had to use a 32-thread Ryzen 3950X CPU instead. More compute never hurts, so I recently tried some tweaks to fully embrace the more powerful CPU.
@@ -19,7 +19,7 @@ The monitor only shows the entire complete frame for 1/60 seconds! Consequently,
 And if you have a beefy CPU and cooling, it's completely useless. Instead, just try to decode every frame. Easy as that.
 
-Another conceptually simple tweak I tried was putting the Reed-Solomon decoding in its own thread. Welp, Python's GIL decided to ruin my day so I had to use both the `threading` and `multiprocessing` modules, but it [worked in the end](https://git.exozy.me/k/6.8301-Project/commit/078079bbd5596c17322b6329c6fce66d997f8e82). CPU usage went up by a factor of three! Hooray!
+Another conceptually simple tweak I tried was putting the Reed-Solomon decoding in its own thread. Welp, Python's GIL decided to ruin my day so I had to use both the `threading` and `multiprocessing` modules, but it [worked in the end](https://git.unnamed.website/6.8301-project/commit/?id=078079bbd5596c17322b6329c6fce66d997f8e82). CPU usage went up by a factor of three! Hooray!
 
 On the physical side of things, I built a structure out of random books to hold my phone so my arms don't get sore. I also tried placing a large foam board, held in place by a plush unicorn, behind the camera and turning up the encoder's screen brightness to avoid stuff from the background appearing as reflections in the encoder's screen.
diff --git a/content/posts/solving-shortest-paths-with-transformers.md b/content/posts/solving-shortest-paths-with-transformers.md
index 97f5e59..32431e7 100644
--- a/content/posts/solving-shortest-paths-with-transformers.md
+++ b/content/posts/solving-shortest-paths-with-transformers.md
@@ -1,7 +1,7 @@
 ---
 title: "Solving Shortest Paths With Transformers"
 date: 2024-12-11T11:29:07-05:00
-description: "Final project for the MIT 6.7960 class by Anthony Wang, Alek Westover, and Kevin Zhao"
+description: "Final project for the MIT 6.7960 class by Anthony Wang, Alek Westover, and Kevin Zhao (the other one)"
 type: "post"
 tags: ["machine-learning", "algorithms", "transformers", "graphs"]
 ---
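As an aside, the `bash` block that the first hunk's context cuts off right before is the systemd-nspawn trick described in the linked 0pointer.net post: booting a container straight off the host's root. A minimal sketch of that trick, assuming a recent systemd (the exact flags used in the blog post itself may differ), looks like this:

```bash
# Boot a throwaway container that reuses the host's /usr, mounted read-only,
# while the rest of the container's root is a fresh tmpfs that vanishes on exit.
sudo systemd-nspawn --directory=/ --volatile=yes --boot
```

`--volatile=yes` is what makes this safe to throw away: only /usr is shared with the host (read-only), and everything written anywhere else lives in memory.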