This is a technical graph. It demonstrates a task that is trivially simple to explain but rather difficult
to actually do. How does one highlight every point within a given distance R of
an arbitrary function f(x)?
The naive approach would be to find the nearest point on f(x) to every pixel on the screen,
and then check whether that distance is less than or equal to R. If so, highlight the
pixel; if not, don't. The issue with this is that finding the nearest point on an arbitrary
function to a given point isn't particularly easy. Newton's method is nice, but it doesn't always converge.
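To make that concrete, here is the minimization the naive approach has to solve for every pixel (x_0, y_0), assuming the usual Euclidean distance:

$$
d(x_0, y_0) = \min_{t}\, \sqrt{(x_0 - t)^2 + \bigl(y_0 - f(t)\bigr)^2}, \qquad \text{highlight iff } d(x_0, y_0) \le R.
$$

Setting the derivative of the squared distance to zero gives (t - x_0) + (f(t) - y_0)\,f'(t) = 0, and that root-finding problem is exactly where Newton's method can fail to converge.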
My approach instead involves defining a special function D(x,y,t) and integrating it
with respect to t. This function is relatively simple: x and y are the
coordinates of any point on the screen, and t picks out the point (t, f(t))
on the arbitrary function f(x) we were given. A simple piecewise expression
then returns 1 if the two points are closer than R and 0 if they are not.
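Written out, that piecewise definition looks something like this (one plausible way to write it; the graph's exact expression may differ):

$$
D(x, y, t) =
\begin{cases}
1 & \text{if } (x - t)^2 + \bigl(y - f(t)\bigr)^2 < R^2,\\
0 & \text{otherwise.}
\end{cases}
$$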
What's the result of integrating this bad boy? Since an integral is really just an infinite sum, which
is effectively an infinite for loop, we can use it to check whether ANY of the infinitely many points on f(x)
come within R units of each pixel. In a world where Desmos could evaluate the integral perfectly,
the integral would be positive exactly for the pixels where D(x,y,t) is 1 over some interval of t,
i.e. the pixels strictly within R of the curve, and those are the ones to highlight.
But unfortunately, it can't evaluate it perfectly, since integration is also a hard problem.
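For intuition only, here is a minimal numerical sketch of the same idea outside Desmos, approximating the integral with a Riemann sum over a grid of "pixels". The choices f(x) = sin(x), R = 0.5, and the window bounds are placeholders, not values from the original graph:

```python
import numpy as np

# Placeholder choices -- not taken from the original graph.
f = np.sin          # the arbitrary function f(x)
R = 0.5             # highlight radius

def D(x, y, t):
    """1 where the pixel (x, y) is within R of the curve point (t, f(t)), else 0."""
    return ((x - t) ** 2 + (y - f(t)) ** 2 <= R ** 2).astype(float)

# Sample t across the visible x-range; a Riemann sum stands in for the integral.
ts = np.linspace(-2 * np.pi, 2 * np.pi, 2000)
dt = ts[1] - ts[0]

# A coarse "screen" of pixel coordinates.
xs = np.linspace(-2 * np.pi, 2 * np.pi, 400)
ys = np.linspace(-2.0, 2.0, 200)
X, Y = np.meshgrid(xs, ys)

# Integrate D over t for every pixel; any positive result means some point
# on the curve came within R of that pixel, so it gets highlighted.
integral = sum(D(X, Y, t) for t in ts) * dt
highlight = integral > 0
```

Plotting `highlight` (for example with matplotlib's imshow) reproduces, up to sampling error, the band the graph is trying to shade.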
You're better off using a signed distance function if one exists!
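As a tiny, hypothetical example of what that buys you: if f happens to be a straight line f(x) = mx + b, the distance from a pixel to the curve has a closed form, so the highlighted band is simply

$$
\frac{\lvert y - mx - b \rvert}{\sqrt{1 + m^2}} \le R,
$$

with no integral or root-finding needed.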