psun.me
https://psun.me/
Recent content on psun.me

Binary search takes O(∛n) time; geometric bounds in computation
https://psun.me/post/geometric/ (Sun, 19 Jan 2020)

Welcome to the new roaring twenties! Will this one end like last century's did? Only time will tell!
For now let's focus on something we can analyze more surely: algorithms.
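For reference, the routine in question is a lower_bound-style binary search: it returns the index of the first element greater than or equal to the target. A minimal Python sketch (my own, not taken from the post):

```python
def lower_bound(a, target):
    """Return the index of the first element of sorted list `a` that is
    >= target, or len(a) if no such element exists (like std::lower_bound)."""
    lo, hi = 0, len(a)
    while lo < hi:
        mid = (lo + hi) // 2
        if a[mid] < target:
            lo = mid + 1   # everything up to mid is too small
        else:
            hi = mid       # a[mid] is a candidate; keep searching left
    return lo

print(lower_bound([1, 3, 3, 5], 3))  # 1
```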
Here's a proof that binary search on an array of length $n$ takes $O(\sqrt[3]{n})$ time. Just to be specific, here's the pseudocode of the binary search (basically equivalent to C++'s std::lower_bound, which returns the first element greater than or equal to the input) that we aim to prove is $O(\sqrt[3]{n})$:

(Finished) Sunrise Alarm
https://psun.me/post/sunrise2/ (Mon, 19 Aug 2019)

I actually finalized this thing a few weeks ago but haven't had the chance to post until now.
Working with the H801 controller was really smooth and it only took an hour or so to get my app to communicate with the lights. Everything's working now and I use my app every day!
Groovin
I also uploaded the source code to GitHub here.

My DIY Sunrise Alarm Clock
https://psun.me/post/sunrise1/ (Wed, 17 Jul 2019)

In the beginning, humans coexisted with nature. Then, we fought to overpower nature. Then, having conquered nature, we sought to bring it back.
Here's how this applies specifically to my sleep:
Waking up to the sun is nice. The time of sunrise and sunset fluctuates as the seasons change, and is almost never in line with my unvarying and semi-nocturnal sleep schedule. So, I use curtains (specifically, really thick blackout curtains) to keep the sun from waking me up too early, and an alarm to wake me up when I want to be woken.

Turn Gmail attachments to Google Docs to save space
https://psun.me/post/gdoctor/ (Fri, 15 Feb 2019)

I've been using around 14.9GB of my 15GB Google storage quota for over a year now:
Every time I got really close to the 15GB mark, I'd search Gmail or Google Drive for my largest items, find some pretty useless ones, and delete them. This worked until about a week ago, when I ran out of obvious candidates for deletion. So, I decided to implement an idea I'd had for a while: a tool that rips attachments out of emails, uploads them as Google Docs/Sheets/Slides (which don't count against the storage quota), and inserts links to the converted documents back into the original email.

When typing, your fingers are like a superscalar processor
https://psun.me/post/cpu-fingers/ (Sun, 06 Jan 2019)

Imagine that our fingers collectively act as a processor, and this processor executes instructions of the form type x, where x is a character on the keyboard. So, to type the phrase hello world, our instruction stream would look like:
type h, type e, type l, type l, type o, type <space>, type w, type o, type r, type l, type d

What characteristics does this processor have, and how does it relate to the ones found in our computers?

FFT-Based Integer Multiplication, Part 2
https://psun.me/post/fft2/ (Tue, 20 Nov 2018)

In the first half of this article we discussed convolution, using the FFT for convolution, and the difficulties associated with using floating-point numbers when exact integer output is required. This second half is about the Schönhage-Strassen algorithm for fast integer multiplication. It multiplies $N$-bit integers in $O(N\log N\log\log N)$ time via a modular-arithmetic-based FFT known as the number theoretic transform (NTT). Schönhage-Strassen is no longer asymptotically the fastest, but it held the title for over thirty years, and its ideas form the basis for newer algorithms that slightly reduce the runtime.

FFT-Based Integer Multiplication, Part 1
https://psun.me/post/fft1/ (Fri, 19 Oct 2018)

The convolution of two discrete, finite, length-$N$ signals $f$ and $g$ (basically two length-$N$ arrays) is denoted $f*g$. The result $f*g$ is also a length-$N$ signal, with
$$(f*g)[n]=\sum_{m=0}^{N-1}f[m]g[n-m]$$
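Taken literally, with the index $n-m$ interpreted mod $N$ (so the convolution is circular), the definition translates directly into code. A quick $O(N^2)$ Python sketch of my own:

```python
def circular_convolve(f, g):
    """Direct O(N^2) circular convolution of two equal-length arrays:
    (f*g)[n] = sum over m of f[m] * g[(n - m) mod N]."""
    N = len(f)
    return [sum(f[m] * g[(n - m) % N] for m in range(N)) for n in range(N)]

# Multiplying the polynomials (1 + 2x) and (3 + 4x), zero-padded to length 4:
print(circular_convolve([1, 2, 0, 0], [3, 4, 0, 0]))  # [3, 10, 8, 0]
```

Zero-padding keeps the high-degree coefficients from wrapping around onto the low ones; the FFT's job is to replace the double loop and bring this down to $O(N\log N)$.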
Convolutions appear in many applications:
Multiplication: Let $f$ and $g$ be the coefficient arrays of two polynomials, so $f[n]$ is the coefficient on the $x^n$ term. Then $(f*g)[n]$ equals the coefficient on the $x^n$ term when we multiply our polynomials (plus a wrapping effect, discussed later).

Markov Chains, Beyond the Mean
https://psun.me/post/markov/ (Tue, 02 Oct 2018)

Here's an example of a typical Markov chain problem: say there's a four-sided die. You keep rolling this die until your current roll comes up with a 2 and your previous roll was a 1. Let $R$ equal your number of rolls. What is $E[R]$?
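The answer can be checked numerically before any chain is drawn. A sketch of my own (not from the post) that writes down the first-step equations for the two unfinished states and solves them with exact rational arithmetic:

```python
from fractions import Fraction as F

# E0, E1 = expected remaining rolls with no progress / having just rolled a 1.
# First-step analysis on a fair 4-sided die:
#   E0 = 1 + (3/4)*E0 + (1/4)*E1      (roll a 1 -> progress; anything else -> none)
#   E1 = 1 + (1/2)*E0 + (1/4)*E1      (roll a 2 -> done; a 1 -> stay; a 3 or 4 -> reset)
# Rearranged as A x = b and solved by Cramer's rule on the 2x2 system.
A = [[F(1) - F(3, 4), -F(1, 4)],
     [-F(1, 2), F(1) - F(1, 4)]]
b = [F(1), F(1)]

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
E0 = (b[0] * A[1][1] - A[0][1] * b[1]) / det
E1 = (A[0][0] * b[1] - b[0] * A[1][0]) / det
print(E0, E1)  # 16 12
```

Starting with no progress, $E[R] = 16$.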
This problem can be modeled as a Markov chain with 3 states:
In this Markov chain, $s_2$ corresponds to being finished (your past two rolls were a 1 then a 2, in that order); $s_1$ corresponds to being partly done (you just rolled a 1 and are now hoping for a 2); and $s_0$ corresponds to no progress (still hoping to roll a 1).

The Poisson, Exponential, and Uniform
https://psun.me/post/poisson1/ (Thu, 20 Sep 2018)

Until recently, I had trouble remembering the pdf of the Poisson distribution. The expression
$$\dfrac{e^{-\lambda} \lambda^n}{n!}$$
didn't make much sense to me. I knew the Poisson could be derived as the limit of the binomial distribution, but that didn't help my memory much.
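The binomial limit is at least easy to check numerically: split time into $n$ slots, each succeeding with probability $\lambda/n$, and the Binomial$(n, \lambda/n)$ pmf approaches the Poisson pmf as $n$ grows. A quick sketch (my own, for illustration):

```python
from math import comb, exp, factorial

lam, k = 3.0, 5  # rate parameter and the count whose probability we want

def poisson_pmf(lam, k):
    return exp(-lam) * lam**k / factorial(k)

def binom_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Binomial(n, lam/n) converges to Poisson(lam) as n grows.
for n in (10, 100, 10_000):
    print(n, binom_pmf(n, lam / n, k))
print("limit:", poisson_pmf(lam, k))
```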
Here's an alternative method of deriving the Poisson's pdf that I personally find much more intuitive. Suppose we have a Poisson process with parameter $\lambda$, and let the hits of this process occur at $t_1, t_2, \ldots$ Define $d_i$ as the wait time between hits:

Hello from my blog
https://psun.me/post/hello-blog/ (Wed, 12 Sep 2018)

It's a static site

I ran a few WordPress sites back in high school, and I was never very thrilled by the experience. The site was kind of slow unless you installed one of those random "WordPress caching" plugins, which usually did their job but sometimes broke things. Every half a year or so, I'd update my WordPress installation (the alternative was facing exposure to unpatched exploits) and pray my plugins worked and the formatting didn't break.